Test UltiHash

This guide will show you how to set up a test environment for UltiHash.

You’ll set up a local environment using Docker Compose for container orchestration.

Please note that to test UltiHash, you need to sign up for a free account.

If you want to test UltiHash in a Kubernetes environment, you can do so with Minikube.

The main steps are as follows:

1. Install prerequisite tools
2. Set up UltiHash with Docker Compose
3. Integrate sample data + see space savings

This setup is intended for local testing - not production use.

For now, UltiHash is only supported on Linux. This guide provides commands to be run in your terminal, and assumes you're running Ubuntu LTS on an AMD64 (x86_64) architecture. Other distributions and ARM architectures should work fine, although some commands may need slight adjustment.

1. Install prerequisite tools

Before you start setting up the UltiHash cluster, you need some tools installed. If you already have any of these installed, you can simply skip that step.

1. Install Docker Engine

Docker provides a containerized virtual environment for UltiHash to run on.

You can find general instructions for installing Docker Engine at docs.docker.com/engine/install.

To quickly install, run:

# Linux installation: Update package index, install prerequisites, and set up Docker’s GPG key
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc

# Add the Docker repository to Apt sources and update package index
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
  $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update

# Install Docker Engine, CLI, and related plugins
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin

After installing Docker, add your user to the docker group so you can run Docker commands without sudo:

sudo usermod -aG docker $USER

Log out and back in (or restart your computer) to apply the group change.

2. Install AWS CLI

The AWS CLI is a unified tool to manage AWS services from the command line.

You can find general instructions for installing the AWS CLI at docs.aws.amazon.com/cli/latest/userguide/getting-started-install.

To quickly install, run:

# Download and unzip AWS CLI installer
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip

# Install AWS CLI
sudo ./aws/install

3. Install boto3 (and tqdm)

The AWS SDK for Python (commonly referred to as boto3) allows you to interact with AWS services programmatically.

To install, run:

sudo apt install python3-boto3

tqdm is a Python package that provides a progress bar, which will be used in the upload scripts.

To install, run:

sudo apt install python3-tqdm

Done! You've successfully installed all the prerequisites for testing UltiHash.

Next, you'll set up your local cluster using Docker Compose.

2. Set up UltiHash with Docker Compose

Now you’ll set up the local UltiHash environment using Docker Compose. This involves authenticating with the UltiHash registry, downloading the necessary configuration file, and running the UltiHash services locally.

1. Set up authentication with the registry

Before you can download and run UltiHash, you need to authenticate with the UltiHash registry. The registry is where the container images (required for running UltiHash) are stored.

For this step, you'll need these credentials from your UltiHash Dashboard:

  • Registry login

  • Registry password


Log in to the UltiHash registry with your credentials:

docker login registry.ultihash.io -u <registry-login>

Make sure to replace <registry-login> with the 'Registry login' from your Dashboard. When prompted for a password, enter the 'Registry password' from your Dashboard.

2. Download compose.yml

The compose.yml file is a Docker Compose configuration file that defines all the services, volumes, and settings needed to run UltiHash. In addition, policies.json defines the basic policies for the cluster.

Download both files:

Trouble downloading? Try right-clicking and selecting 'Save link as...' or similar.

3. Set up credentials and license

To enable access to UltiHash services, you need to export your credentials and license key. These environment variables will be used for authentication.

Run the following commands:

export AWS_ACCESS_KEY_ID="TEST-USER"
export AWS_SECRET_ACCESS_KEY="SECRET"
export UH_LICENSE_STRING="<license-key>"

Make sure to replace <license-key> with the 'License key' from your Dashboard.
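If you want to confirm all three variables are set before starting the cluster, a small check like this works (a sketch, not part of the guide's scripts; the variable names come from the exports above):

```python
# Sanity check (illustrative): confirm the three variables exported above are
# present and non-empty in the current environment.
import os

REQUIRED = ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "UH_LICENSE_STRING"]

def missing_vars(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

print("Missing:", ", ".join(missing_vars()) or "none")
```

Run it with python3 in the same shell where you ran the exports; it should print "Missing: none".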

4. Start UltiHash services

Change the working directory to the folder where you saved compose.yml and policies.json. For example:

cd ~/Downloads

Start the UltiHash cluster:

docker compose up -d

If successful, Docker Compose will download the necessary images (if they’re not already cached) and start the UltiHash services.
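If you'd like a quick programmatic check that the cluster is up, a small sketch like this (not an official UltiHash tool) probes the local S3 endpoint on port 8080, the port used throughout this guide. Any HTTP response, even an error status, means the service is listening:

```python
# Optional reachability check for the local UltiHash S3 endpoint (port 8080
# per this guide). Returns True if anything answers HTTP on that address.
import urllib.request
import urllib.error

def endpoint_reachable(url="http://127.0.0.1:8080", timeout=3):
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, just with an error status
    except OSError:
        return False  # connection refused, DNS failure, or timeout

print(endpoint_reachable())
```

You can also run docker compose ps in the same directory to see the state of each service.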

Done!

You’ve successfully set up your local UltiHash cluster. Next, let's integrate sample data + see space savings.

3. Integrate sample data + see space savings

Now that UltiHash is running on your local cluster, let's integrate some sample data.

1. Prepare dataset

If you have a dataset you want to test already, you can skip this step.

Alternatively, you can download one of these datasets from Kaggle:

Remember to unzip your test dataset if you download it from Kaggle.

UltiHash's deduplication can have significantly different results depending on the dataset integrated. For testing, try datasets likely to contain repeated content - like document libraries with shared templates, multimedia collections with common graphics, or code repositories.
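To build intuition for why repeated content matters, here is a toy sketch of block-level deduplication (not UltiHash's actual algorithm): split data into fixed-size blocks and store each unique block only once. The fewer unique blocks, the bigger the savings:

```python
# Toy fixed-size block deduplication (illustrative only, NOT UltiHash's
# algorithm): count total vs. unique 4 KiB blocks in a byte string.
import hashlib

def dedup_stats(data, block_size=4096):
    """Return (total_blocks, unique_blocks) for fixed-size blocks of data."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    unique = {hashlib.sha256(block).digest() for block in blocks}
    return len(blocks), len(unique)

repetitive = (b"shared template " * 256) * 10   # the same 4 KiB repeated 10 times
total, unique = dedup_stats(repetitive)
print(f"{total} blocks, {unique} unique")        # 10 blocks, 1 unique
```

A dataset of incompressible, never-repeating data would instead show as many unique blocks as total blocks, and little to no savings.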

2. Create a bucket

Object storage systems like UltiHash use a top-level container called a bucket. To facilitate scalability, buckets don’t have a traditional hierarchical folder structure: instead, each object in a bucket has a unique key (which can resemble a file path, simulating directories).
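The flat key model can be illustrated with a few hypothetical keys: "directories" are just shared key prefixes, which S3-style clients recover by splitting keys on a delimiter:

```python
# Illustration of the flat key namespace with hypothetical keys: there are no
# real folders, only keys whose prefixes look like directory paths.
keys = [
    "reports/2024/q1.pdf",
    "reports/2024/q2.pdf",
    "reports/logo.png",
    "readme.txt",
]

def top_level_prefixes(keys, delimiter="/"):
    """Mimic the 'CommonPrefixes' an S3-style listing returns at the bucket root."""
    prefixes = {key.split(delimiter, 1)[0] + delimiter
                for key in keys if delimiter in key}
    return sorted(prefixes)

print(top_level_prefixes(keys))  # ['reports/']
```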

To create a bucket, run:

aws s3api create-bucket --bucket <bucket-name> --endpoint-url http://127.0.0.1:8080

Make sure to replace <bucket-name> with your chosen bucket name, e.g. test-bucket.


You can see your newly created bucket by running:

aws s3api list-buckets --endpoint-url http://127.0.0.1:8080

3. Download scripts

We've prepared some scripts to make the testing process easier.

Download the following scripts for uploading and downloading:

Trouble downloading these scripts? Try right-clicking and selecting 'Save link as...' or similar.

4. Integrate sample data

Now that you have a bucket in which to put objects, let's use the upload script to integrate your sample data.

To integrate your dataset, run:

python3 <upload-script-path> --url http://127.0.0.1:8080 --bucket <bucket-name> <dataset-path>

Make sure to replace <upload-script-path> with the path to the upload script you downloaded, e.g. /home/user/Downloads/uh-upload.py.

Also replace <bucket-name> with your bucket name.

Finally, replace <dataset-path> with the path to the directory for the dataset you prepared or downloaded, e.g. /home/user/Downloads/test-dataset.

A bar should display the ongoing progress of your integration.
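A typical way upload scripts map a local directory to object keys (an assumption about the provided script's behavior, shown with hypothetical paths) is to use each file's path relative to the dataset root as its key:

```python
# Sketch of a local-path-to-object-key mapping (assumed behavior, hypothetical
# paths): the path relative to the dataset root becomes the key, with '/'
# separators regardless of platform.
from pathlib import PurePosixPath

def object_key(dataset_root, file_path):
    """Derive an object key from a file's path under the dataset root."""
    return PurePosixPath(file_path).relative_to(dataset_root).as_posix()

print(object_key("/home/user/Downloads/test-dataset",
                 "/home/user/Downloads/test-dataset/images/cat.jpg"))
# images/cat.jpg
```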


Once the integration is complete, you can run the following command to see your objects:

aws s3api list-objects --endpoint-url http://127.0.0.1:8080 --bucket <bucket-name> --output text | cat

Make sure to replace <bucket-name> with your bucket name.


You can also download an entire bucket by running:

python3 <download-script-path> --url http://127.0.0.1:8080 --path <destination-path> <bucket-name>

Make sure to replace <download-script-path> with the path to the download script you downloaded, e.g. /home/user/Downloads/uh-download.py.

Also replace <destination-path> with the path to the directory you want to download the bucket to, e.g. /home/user/Downloads.

Finally, replace <bucket-name> with the name of the bucket to download.

5. See space savings in your cluster

You can see the storage space UltiHash is saving across the entire cluster by running the uh-see-space-savings script:

python3 <see-space-savings-script-path> --url http://127.0.0.1:8080

Make sure to replace <see-space-savings-script-path> with the path to the uh-see-space-savings script you downloaded, e.g. /home/user/Downloads/uh-see-space-savings.py.

Done! You’ve successfully integrated a dataset to a local test cluster, and can see the space saved by UltiHash's built-in deduplication.
