Quick Start

This Quick Start guide shows how to run your first Bacalhau job with minimal setup. Because Bacalhau ships as a single self-contained binary, you can stand up your own distributed compute network in minutes.

Prerequisites

  • Docker installed on any machine that runs a compute node
  • Bacalhau CLI installed (see below)

1. Installation

  1. Install Bacalhau using the one-liner below (Linux/macOS), or see the installation guide for Windows and Docker options.
curl -sL https://get.bacalhau.org/install.sh | bash
  2. Once installed, verify with:
bacalhau version

2. Start a Hybrid Node

Open a terminal and run:

bacalhau serve --orchestrator --compute
  • This command launches both an orchestrator and a compute node in one process
  • Keep it running; you'll see logs indicating it's ready
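
In a second terminal, you can confirm the node is up and registered. This assumes the CLI is talking to the default local API endpoint:

bacalhau node list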

3. Submit a Data Analysis Job

Bacalhau supports two primary methods of job submission: imperative (CLI flags) and declarative (a YAML job spec). We'll demonstrate a word count job on the classic novel Moby Dick using the imperative CLI; a declarative sketch of the same job appears below.

bacalhau docker run \
  --input https://www.gutenberg.org/files/2701/2701-0.txt:/data/moby-dick.txt \
  --output outputs:/outputs \
  --publisher local \
  ghcr.io/bacalhau-project/word-count:latest -- --output-file /outputs/moby-dick-counts.txt /data/moby-dick.txt
  • You'll receive a job ID as soon as the job is submitted
  • The job downloads the sample dataset and processes it on your local node
  • Bacalhau displays job progress until completion
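
For reference, here is a declarative sketch of the same job. This is an illustration only: the field names below assume the v1.x declarative job schema and may differ in your Bacalhau version, so check the job specification reference.

# word-count.yaml -- sketch of the declarative equivalent (v1.x schema assumed)
Name: word-count
Type: batch
Count: 1
Tasks:
  - Name: main
    Engine:
      Type: docker
      Params:
        Image: ghcr.io/bacalhau-project/word-count:latest
        Parameters:
          - --output-file
          - /outputs/moby-dick-counts.txt
          - /data/moby-dick.txt
    InputSources:
      - Target: /data/moby-dick.txt
        Source:
          Type: urlDownload
          Params:
            URL: https://www.gutenberg.org/files/2701/2701-0.txt
    Publisher:
      Type: local
    ResultPaths:
      - Name: outputs
        Path: /outputs

Submit it with:

bacalhau job run word-count.yaml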

4. Inspect the Job

bacalhau job describe <jobID>
  • Replace <jobID> with the actual ID printed in step 3
  • Run bacalhau job logs <jobID> to fetch just the execution logs
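
If you've misplaced the job ID, list recent jobs to find it:

bacalhau job list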

5. Retrieve Results

Download and view your job results:

# Download the results
bacalhau job get <jobID>

# View the analysis output
head job-*/outputs/moby-dick-counts.txt
Note: You should see a word frequency analysis of the Moby Dick text file!
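
To find the most frequent words, sort the output. This sketch assumes each line of the results file holds a word followed by its count; check your actual output format first:

# Sort numerically by the second column (the count), highest first
sort -k2 -rn job-*/outputs/moby-dick-counts.txt | head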

🎉 Success!

You've just:

  1. Started a local Bacalhau network
  2. Submitted a containerized word-count job with the imperative CLI (and seen its declarative YAML equivalent)
  3. Inspected the job's status and logs
  4. Retrieved and viewed job results