RedTeamSubnet/dev-fingerprinter-challenge

Device Fingerprinter (DFP) Challenge Platform

Overview

Welcome to the Device Fingerprinter (DFP) Challenge Platform. This project is the official server that runs and evaluates submissions for the DFP challenge.

The primary goal of this challenge is to develop a browser SDK that can accurately and consistently identify different physical devices across various browser environments. To achieve this, your submission must generate a unique fingerprint that remains the same for a specific device, regardless of the browser used, while ensuring it is distinct from fingerprints generated by other physical devices.

This platform's purpose is to test a fingerprinter script's ability to maintain high internal consistency (the same ID for the same device) and high external uniqueness (different IDs for different devices).

How The Challenge Works

When you submit your solution for scoring, the following automated process occurs:

  1. Submission Received: The DFP server receives your fingerprinter.js script via an API call to the /score endpoint.
  2. SDK Distribution: The server sends your submitted script to the DFP proxy server to be served to the target devices.
  3. Multi-Browser Session: The server initiates a session involving multiple physical devices and several target browsers: chrome, brave, firefox-focus, duckduckgo, and safari.
  4. Device Notification: For each batch (a specific browser), the server sends an email notification to the physical devices. A human or automated process on those devices then opens the requested browser.
  5. Fingerprint Generation: Your fingerprinter.js script executes inside each browser on each device. It must analyze the device's characteristics and send a generated "fingerprint" string back to the server's /_fingerprint endpoint (via the proxy).
  6. Data Collection: The server collects all reported fingerprints, mapping them to the specific physical device and browser that generated them.
  7. Scoring: Once all tests are complete (or the timeout is reached), a final score is calculated based on the consistency of fingerprints for each physical device and the uniqueness of fingerprints between different devices.
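Steps 4–5 above can be sketched as a minimal fingerprinter.js. The signals collected, the FNV-1a hash, and the JSON shape of the /_fingerprint payload are all illustrative assumptions, not the required interface; a real submission must choose signals that stay identical across every target browser on a device.

```javascript
// Minimal fingerprinter.js sketch. Signal list and payload shape
// ({ fingerprint: "<hex>" }) are assumptions, not the required API.

// FNV-1a (32-bit) to fold collected signals into a stable hex string.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193);
  }
  return (h >>> 0).toString(16).padStart(8, "0");
}

// Illustrative signals; a real submission needs values that are the
// same in chrome, brave, firefox-focus, duckduckgo, and safari.
function collectSignals() {
  return [
    screen.width + "x" + screen.height,
    navigator.hardwareConcurrency,
    Intl.DateTimeFormat().resolvedOptions().timeZone,
  ].join("|");
}

// Send the generated fingerprint back to /_fingerprint (via the proxy).
async function reportFingerprint() {
  const fingerprint = fnv1a(collectSignals());
  await fetch("/_fingerprint", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ fingerprint }),
  });
}

// Only auto-run inside a browser environment.
if (typeof window !== "undefined") {
  reportFingerprint();
}
```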

Scoring System

The scoring is designed to reward both consistency (the same ID for the same device) and uniqueness (different IDs for different devices).

  • Internal Consistency (Fragmentation): A single physical device is expected to produce the same fingerprint across all browsers.
    • Fragmentation Penalty: Each unique fingerprint generated by the same physical device beyond the first incurs a penalty (default: -0.3).
    • Fragmentation Limit: If a single device generates too many unique fingerprints (default: 3 or more), its contribution to the final score is 0.0.
  • External Uniqueness (Collision): Different physical devices should never share the same fingerprint.
    • Two-Strike Collision Rule: If a fingerprint string is shared by two or more physical devices within a batch, each of those devices records a collision strike.
      • Strike 1: A collision in 1 batch incurs a penalty (default: -0.25).
      • Strike 2: Collisions in 2 or more batches result in a score of 0.0 for those devices.
  • Minimum Devices: At least two unique physical devices must report fingerprints for the session to be valid. If fewer report, the final score is 0.0.
  • Final Score: The final score is the average score across all target physical devices, normalized between 0.0 and 1.0.
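The rules above can be sketched as follows. The penalty defaults come from the text, but the function shapes, the clamping, and the exact order in which penalties are applied are assumptions, not the server's actual implementation.

```javascript
// Sketch of the scoring rules; defaults taken from the text above,
// aggregation details are assumptions.
const FRAG_PENALTY = 0.3;        // per extra unique fingerprint on one device
const FRAG_LIMIT = 3;            // unique fingerprints at which a device scores 0.0
const COLLISION_PENALTY = 0.25;  // strike 1: one batch with a collision

// fingerprints: all fingerprints one device reported across batches.
// collidingBatches: batches in which this device shared a fingerprint
// with another device.
function scoreDevice(fingerprints, collidingBatches) {
  const unique = new Set(fingerprints).size;
  if (unique >= FRAG_LIMIT) return 0.0;   // fragmentation limit
  if (collidingBatches >= 2) return 0.0;  // strike 2
  let score = 1.0 - (unique - 1) * FRAG_PENALTY;
  if (collidingBatches === 1) score -= COLLISION_PENALTY; // strike 1
  return Math.max(0.0, Math.min(1.0, score));
}

// Final score: average over devices, invalid with fewer than two.
function finalScore(perDeviceScores) {
  if (perDeviceScores.length < 2) return 0.0;
  return perDeviceScores.reduce((a, b) => a + b, 0) / perDeviceScores.length;
}
```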

Local Testing & Submission

To test your solution locally, you first need to run the challenge server. Then, you must send an authenticated POST request to the /score endpoint.

The body of the request must be a JSON object containing your fingerprinter_js script.

Example of the JSON structure for your submission:

{
  "miner_input": {
    "random_val": "a1b2c3d4e5f6g7h8"
  },
  "miner_output": {
    "fingerprinter_js": "/* your javascript code to generate a unique fingerprint */"
  }
}

The API key for authentication is the REWARDING_SECRET_KEY value defined in your .env file.
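Assuming the server is listening locally on DFP_API_PORT (8000 is used here as a placeholder) and accepts the key in an X-API-Key header — the actual header name is not documented here, so check the server code — a submission request could be built like this:

```javascript
// Sketch of a local /score submission. The port and the X-API-Key
// header name are assumptions; verify both against the server code.
const API_URL = "http://localhost:8000/score";
const API_KEY = process.env.REWARDING_SECRET_KEY; // from your .env file

function buildScoreRequest(fingerprinterJs, randomVal) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-API-Key": API_KEY,
    },
    body: JSON.stringify({
      miner_input: { random_val: randomVal },
      miner_output: { fingerprinter_js: fingerprinterJs },
    }),
  };
}

// Usage (Node 18+):
// const res = await fetch(API_URL, buildScoreRequest("/* your js */", "a1b2c3d4e5f6g7h8"));
```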


Testing Manual

Setup and Installation

  1. Clone the repository.

  2. Create Environment Files: Copy the provided examples for your environment.

    # Copy the environment variable file
    cp .env.example .env
    
    # Copy the development docker override file
    cp ./templates/compose/compose.override.dev.yml ./compose.override.yml
  3. Customize Configuration: Edit the .env and compose.override.yml files to match your environment settings. Ensure your device configuration and SMTP settings are correct.

  4. Start the Server: Use the compose.sh script or standard Docker Compose commands.

    # Start docker compose
    ./compose.sh start -l
  5. Stop the Server:

    # Stop docker compose
    ./compose.sh stop

Configuration

The primary configuration is managed through environment variables in the .env file and src/api/configs/challenge.yml.

  • ENV: Sets the environment (e.g., LOCAL, PRODUCTION).
  • DEBUG: Set to true to enable debug mode.
  • DFP_API_PORT: The port the main API server will listen on.
  • REWARDING_SECRET_KEY: Important: This is the secret API key used to authenticate with the /score and /results endpoints.
  • DFP_CHALLENGE_SMTP_*: Configuration for the email notification system.
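An illustrative .env fragment covering those variables is shown below. The values and the specific DFP_CHALLENGE_SMTP_* suffixes are assumptions for illustration only; start from the provided .env.example.

```shell
# Illustrative .env fragment -- values are placeholders,
# and the SMTP variable suffixes below are assumptions.
ENV=LOCAL
DEBUG=true
DFP_API_PORT=8000
REWARDING_SECRET_KEY=change-me-to-a-long-random-secret
DFP_CHALLENGE_SMTP_HOST=smtp.example.com
DFP_CHALLENGE_SMTP_PORT=587
```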
