Getting Started
Everything you need to install, configure, and write your first research protocol with the Bloodhound framework.
Installation
```bash
# Clone the repository
git clone https://github.com/bloodhound-framework/bloodhound.git
cd bloodhound

# Install Python dependencies
pip install -e .

# Build the Rust core (optional, for VM runtime)
cargo build --release

# Verify installation
python -m st_hurbet.validation.run_validation
```
- Python 3.10+
- Rust 1.75+ (for VM core, optional)
- NumPy, SciPy, Pandas (installed automatically)
- PyTorch (for domain compilers, optional)
Project Structure
```text
bloodhound/
├── st_hurbet/                    # Core execution engine
│   ├── validation/               # Validation & reference implementations
│   │   ├── s_entropy.py          # S-entropy coordinate system
│   │   ├── ternary.py            # Ternary representation & addressing
│   │   ├── trajectory.py         # Trajectory navigation
│   │   ├── categorical_memory.py # Memory hierarchy
│   │   ├── maxwell_demon.py      # Zero-cost sorting controller
│   │   ├── distributed.py        # Network coordination layer
│   │   ├── enhancement.py        # Temporal precision enhancement
│   │   └── run_validation.py     # Full validation suite
│   ├── docs/                     # Research papers & documentation
│   └── publication/              # Publication materials
├── src/
│   └── bloodhound_vm_core/       # Rust VM runtime
│       ├── consciousness.rs      # Consciousness-aware processing
│       ├── entropy.rs            # S-entropy implementation
│       ├── oscillatory.rs        # Oscillatory dynamics
│       └── runtime.rs            # VM runtime loop
├── backend/                      # Python backend & APIs
├── frontend/                     # React visualization frontend
├── docs/                         # Extended documentation
├── bloodhound.toml               # Main configuration
├── Cargo.toml                    # Rust workspace
└── pyproject.toml                # Python project config
```
Triangle DSL Reference
Triangle is the domain-specific language for specifying research protocols. Each statement maps to a morphism chain through S-entropy space. The language is designed around navigation, not computation — you specify what to investigate and what evidence constitutes convergence.
```
S(0.5, 0.3, 0.2)   # Direct S-entropy coordinate
S.012.201.100      # Trit address (depth 9)
```
```
# Extract specific data from a source
genotype = slice genomics.ACTN3 @ cohort(elite_sprinters) @ variant(rs1815739)

# Mass spectrometry extraction
spectrum = slice metabolomics @ mz(400..600) @ rt(12.5..13.2)
```
```
# Compose two understanding fragments
joined = compose genotype with cardiac preserving athlete_id

# Multi-source composition
integrated = compose genomics with proteomics with transcriptomics preserving gene_id
```
```
# Navigate to target with completion condition
result = navigate joined to target via correlation_analysis

# Completion conditions
complete when distance < epsilon
complete at depth 12
complete when confidence > 0.95
converge at confidence > 0.95
```
```
# Extract from multiple sources simultaneously
parallel {
    hrv = slice biometrics.hrv @ cohort(elite_sprinters)
    genes = slice genomics.ACTN3 @ cohort(elite_sprinters)
}
```
```
investigate "Association between ACTN3 genotype and cardiac adaptation in elite sprinters"
    with confidence > 0.95
    with significance < 0.01

parallel {
    genotype = slice genomics.ACTN3 @ cohort(elite_sprinters) @ variant(rs1815739)
    cardiac  = slice echocardiography @ cohort(elite_sprinters) @ measure(LV_mass, EF, GLS)
    protein  = slice proteomics @ target(alpha_actinin_3) @ tissue(cardiac_muscle)
}

joined = compose genotype with cardiac preserving athlete_id
result = navigate joined to target via correlation_analysis

converge at confidence > 0.95
```
Python API
```python
from st_hurbet.validation.s_entropy import SCoordinate, SEntropyCore

# Create coordinates in bounded [0,1]³ space
start = SCoordinate(s_k=0.1, s_t=0.2, s_e=0.3)
target = SCoordinate(s_k=0.8, s_t=0.7, s_e=0.9)

# Calculate categorical distance
core = SEntropyCore()
d = core.categorical_distance(start, target)
print(f"Categorical distance: {d}")
```
```python
from st_hurbet.validation.trajectory import TrajectoryNavigator

# Navigate from start to target
navigator = TrajectoryNavigator(epsilon=1e-3)
trajectory = navigator.navigate(start, target)

# The trajectory IS the address
print(f"Address: {trajectory.address}")
print(f"Path length: {trajectory.length()}")
```
```python
from st_hurbet.validation.categorical_memory import CategoricalMemory

# Create hierarchical memory
memory = CategoricalMemory(depth=6)

# Store at S-entropy coordinate
memory.store(coord, data)

# Retrieve — tier determined by categorical distance
result = memory.retrieve(query_coord)
```
Core API Reference
`SCoordinate(s_k, s_t, s_e)`: a point in the bounded [0,1]³ S-entropy space. All three coordinates must lie in [0, 1]. Represents the entropy state of an information fragment.
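As a rough illustration of the bounded-coordinate contract described above, here is a minimal sketch of such a type with range validation. This is a hypothetical stand-in; the actual `SCoordinate` in `s_entropy.py` may be implemented differently:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BoundedCoordinate:
    """Illustrative stand-in for SCoordinate: a point in [0, 1]^3."""
    s_k: float  # knowledge dimension
    s_t: float  # time dimension
    s_e: float  # entropy dimension

    def __post_init__(self):
        # Enforce the bounded [0, 1] range on every axis.
        for name in ("s_k", "s_t", "s_e"):
            v = getattr(self, name)
            if not 0.0 <= v <= 1.0:
                raise ValueError(f"{name}={v} outside the bounded [0, 1] range")
```

Freezing the dataclass keeps coordinates hashable and immutable, which suits their role as stable addresses.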
`SEntropyCore.categorical_distance(a, b)`: computes the categorical distance between two S-coordinates. This distance is independent of Euclidean distance; it is used for memory tier assignment and trajectory completion detection.
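To see how a categorical metric can disagree with Euclidean distance, here is one purely illustrative notion: the refinement level at which two points' base-3 cells first diverge. This is my own sketch, not the metric defined in `s_entropy.py`:

```python
def trits(x: float, depth: int) -> list[int]:
    """Base-3 digits of x in [0, 1) down to the given depth."""
    out = []
    for _ in range(depth):
        x *= 3
        d = min(int(x), 2)
        out.append(d)
        x -= d
    return out

def cell_divergence(a: float, b: float, depth: int = 12) -> int:
    """Illustrative categorical distance: first level where the 3^k cells differ."""
    for level, (da, db) in enumerate(zip(trits(a, depth), trits(b, depth))):
        if da != db:
            return depth - level  # diverging at a coarse level => categorically far
    return 0  # same cell all the way down

# Euclidean-close points straddling a cell boundary are categorically far:
print(cell_divergence(1/3 - 1e-6, 1/3 + 1e-6))  # diverges at level 0 → 12
```

The point of the sketch: two points separated by 2e-6 can be maximally distant categorically, which is the kind of Euclidean independence the API description refers to.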
`TrajectoryNavigator.navigate(start, target)`: navigates from start to target through S-entropy space. Returns a Trajectory object encoding the path, which simultaneously serves as the address and the result identifier.
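The "trajectory is the address" idea can be illustrated with a toy 1-D navigator that refines toward a target one base-3 cell at a time: the trit choices it emits are at once the path taken and the address of the final cell. This is an assumption-laden sketch, not the `TrajectoryNavigator` algorithm:

```python
def toy_navigate(target: float, epsilon: float = 1e-3) -> str:
    """Refine a [0, 1) interval toward target; return the trit path/address."""
    lo, width, path = 0.0, 1.0, []
    while width > epsilon:            # completion condition: cell smaller than epsilon
        width /= 3
        trit = min(int((target - lo) / width), 2)
        path.append(str(trit))
        lo += trit * width
    return "".join(path)              # the path IS the address of the final cell

address = toy_navigate(0.62)
```

Decoding the returned string reproduces the final cell exactly, which is why no separate result pointer is needed in this scheme.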
`CategoricalMemory.store(coord, data)`: stores data at an S-entropy coordinate, automatically assigning it to the correct memory tier based on categorical distance. Returns the ternary address.
Encodes an S-coordinate as a ternary address at the specified depth (see ternary.py). The mapping is bijective: each address corresponds to exactly one cell in the 3^k partition.
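A hedged sketch of such an encoding: take each axis's base-3 digits to the given depth and interleave them, one trit per axis per level. The grouping into threes mirrors addresses like `S.012.201.100` shown earlier, but the actual digit ordering is an assumption on my part; the real convention lives in `ternary.py`:

```python
def axis_trits(x: float, depth: int) -> list[int]:
    """Base-3 digits of x in [0, 1) down to the given depth (one axis)."""
    out = []
    for _ in range(depth):
        x *= 3
        d = min(int(x), 2)
        out.append(d)
        x -= d
    return out

def encode(s_k: float, s_t: float, s_e: float, depth: int) -> str:
    """Hypothetical encoding: one trit per axis per level, grouped per level."""
    ks, ts, es = (axis_trits(v, depth) for v in (s_k, s_t, s_e))
    return "S." + ".".join(f"{ks[i]}{ts[i]}{es[i]}" for i in range(depth))

print(encode(0.5, 0.5, 0.5, 3))  # S.111.111.111
```

Bijectivity holds by construction: each depth-k digit string per axis names exactly one interval of width 3^-k, so each full address names exactly one cell of the 3D partition.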
Configuration
The main configuration file is bloodhound.toml in the project root. Key sections:
```toml
[s_entropy]
enable_navigation = true
coordinate_precision = 1e-15
endpoint_prediction = true
zero_time_computation = true
knowledge_dimension_weight = 0.4
time_dimension_weight = 0.3
entropy_dimension_weight = 0.3
```
```toml
[consciousness]
bmd_frame_selection = true
semantic_understanding = true
recursive_self_awareness = true
consciousness_loops = true
```
```toml
[purpose_framework]
enable_framework = true
domain_learning = true
enhanced_distillation = true
adaptation_precision = 1e-12
information_density_target = 2.5
```
```toml
[combine_harvester]
enable_framework = true
multi_domain_integration = true
router_algorithms = ["keyword", "embedding", "classifier", "llm"]
optimal_routing = "embedding_based"
```
Running Validation
```bash
python -m st_hurbet.validation.run_validation

# Expected output: 10 theorems verified
# ✓ Triple Equivalence
# ✓ Trit-Cell Correspondence
# ✓ Trajectory-Position Identity
# ✓ Completion Equivalence
# ✓ Zero-Cost Sorting
# ✓ Observable Commutation
# ✓ Exponential Decay
# ✓ Central State Impossibility
# ✓ Distance Independence
# ✓ Continuous Emergence
```
The validation suite tests all core theorems against the reference Python implementation. Each test is deterministic and self-contained; no external data or API access is required for theorem validation. Only the ACTN3 end-to-end validation (on the Validation page) queries live public APIs.
Next Steps
Architecture Deep-Dive
Understand the three-layer system, S-entropy coordinates, and distributed coordination.
Use Cases
See how the framework applies to genomics, metabolomics, clinical imaging, and more.
Validation Results
Review the empirical evidence: 7/7 checks passed on the ACTN3 multi-omics study.
Collaborate
Find the right partnership track: research, domain compilers, infrastructure, or funding.