Matrix.Corp

Independent AI research organization building specialized models, frontier agentic systems, and new intelligence paradigms.

HuggingFace: Matrix-Corp · Founded by: Zandy-Wandy · GitHub: zapgaming


Status Legend

| Badge | Meaning |
|---|---|
| 🟢 Released | Weights available, ready to use |
| 🟡 Preview | Architecture published, training planned |
| 🔴 Planned | Design complete, not yet built |
| 🩵 Long-Term | Vision defined, major research ahead |
| 🟣 Closed | Proprietary weights |
| ⬛ Deprecated | Cancelled or superseded |

Models

🌌 Zenith — Reasoning + Emotional Intelligence

Status: 🟡 Preview · Target: Tenstorrent Blackhole p300a

Transformer models with a built-in EQ Engine — a dedicated emotional intelligence layer that sits alongside the reasoning stack. Ring Attention (32K context), MoE (12 experts, top-2 routing), Ollama + vLLM compatible.
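The top-2 routing above can be sketched in a few lines: score a token against each expert, keep the two highest-scoring, and mix their outputs by softmax weight. A minimal plain-Python illustration; the real Zenith router, gate shapes, and expert definitions are not published, so every name here is an assumption:

```python
import math

def top2_moe(token, gate_w, experts):
    """Route a token to the 2 highest-scoring of N experts and mix their outputs."""
    # Gate logits: one score per expert.
    logits = [sum(w * x for w, x in zip(row, token)) for row in gate_w]
    # Pick the top-2 experts.
    top2 = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:2]
    # Softmax over just the two selected logits to get mixing weights.
    m = max(logits[i] for i in top2)
    exps = [math.exp(logits[i] - m) for i in top2]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Weighted sum of the two expert outputs.
    out = [0.0] * len(token)
    for w, i in zip(weights, top2):
        y = experts[i](token)
        out = [o + w * yi for o, yi in zip(out, y)]
    return out

# Toy setup: 12 "experts" that each just scale the token differently.
experts = [lambda t, k=k: [k * x for x in t] for k in range(12)]
gate_w = [[1.0 if k == 3 else 0.1, 0.0] for k in range(12)]  # gate favors expert 3
print(top2_moe([1.0, 2.0], gate_w, experts))
```

Only 2 of the 12 experts run per token, which is what keeps MoE inference cheap relative to a dense model of the same total size.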

| Model | Params | Base | Link |
|---|---|---|---|
| Zenith-7B-V1 | 7B | Qwen2.5-Coder-7B | → |
| Zenith-28B-V1 | 28B | Qwen3.5-27B (Opus 4.6 distilled) | → |
| Zenith-32B-V1 | 32B | DeepSeek-R1-Distill-Qwen-32B | → |
| Zenith-70B-V1 | 70B | DeepSeek-R1-Distill-Llama-70B | → |

View Zenith Collection →


🔬 Vortex Scientific — Deep Science Reasoning

Status: 🟡 Preview · Target: MacBook M2/M3 + Nvidia 4060

Built from scratch — no base model. Custom science tokenizer with a 50K vocabulary. Hybrid SSM+Attention architecture with four domain-specific modules: Equation/LaTeX, Numerical, Citation, and Molecular/Periodic Table.
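A hybrid SSM+Attention layer of the kind described interleaves a recurrent state-space scan (linear in sequence length) with quadratic attention. A scalar toy sketch, with all coefficients invented for illustration; Vortex's actual layer layout is not published:

```python
import math

def ssm_scan(xs, a=0.9, b=1.0, c=1.0):
    """Diagonal state-space recurrence: h_t = a*h_{t-1} + b*x_t, y_t = c*h_t."""
    h, ys = 0.0, []
    for x in xs:
        h = a * h + b * x
        ys.append(c * h)
    return ys

def attention(qs, ks, vs):
    """Scalar single-head attention: each position attends over the full sequence."""
    out = []
    for q in qs:
        scores = [q * k for k in ks]
        m = max(scores)
        w = [math.exp(s - m) for s in scores]
        z = sum(w)
        out.append(sum(wi / z * v for wi, v in zip(w, vs)))
    return out

def hybrid_block(xs):
    """One hybrid layer: SSM scan carries cheap long-range state, attention refines it,
    and a residual connection adds the result back to the input."""
    s = ssm_scan(xs)
    return [x + y for x, y in zip(xs, attention(s, s, s))]

print(hybrid_block([1.0, 0.0, -1.0, 0.5]))
```

The appeal of the hybrid is that the scan handles very long contexts at O(n) cost while attention handles precise token-to-token lookups.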

| Model | Params | Link |
|---|---|---|
| Vortex-7B-V1 | 7B | → |
| Vortex-13B-V1 | 13B | → |

View Vortex Collection →


🌿 Touch Grass — Music AI

Status: 🟡 Preview · Target: Any hardware

A LoRA fine-tune of Qwen3.5 built for musicians, with a Tab & Chord Module, Music Theory Engine, Ear Training, EQ Adapter (4 emotional modes), and Songwriting Module.
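As a reminder of what a LoRA fine-tune actually changes: the base weights W stay frozen and only a low-rank update B·A is learned, so the effective weight is W + (alpha/r)·B·A. A toy merge with made-up dimensions; none of Touch Grass's actual adapter shapes are published:

```python
def matmul(A, B):
    """Naive matrix multiply, fine for tiny illustrative matrices."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def lora_merge(W, A, B, alpha=16, r=2):
    """Effective weight after LoRA: W + (alpha / r) * B @ A.
    W is frozen; only the small A and B matrices were trained."""
    delta = matmul(B, A)
    s = alpha / r
    return [[w + s * d for w, d in zip(wr, dr)] for wr, dr in zip(W, delta)]

# Frozen 2x2 base weight, rank-1 adapters (B: 2x1, A: 1x2).
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[0.5], [0.0]]
A = [[0.1, 0.2]]
print(lora_merge(W, A, B, alpha=2, r=1))
```

Because only A and B are trained, the adapter is tiny compared to the base model, which is why a LoRA like this can target "any hardware".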

| Model | Params | Base | Link |
|---|---|---|---|
| TouchGrass-3B | 3B | Qwen3.5-3B-Instruct | → |
| TouchGrass-7B | 7B | Qwen3.5-7B-Instruct | → |

View Touch Grass Collection →


๐ŸŒ Matrix Lattice โ€” Frontier Agentic MoE

Status: 🟢 Released · 🟣 Closed Source · Target: 4–32× H100 / Tenstorrent p300a

Shipped. Our largest and most capable system. Frontier-scale mixture-of-experts with 17 custom intelligence modules including: EQ Engine V2, Multi-Agent Coordination Layer (MACL), Hierarchical Context Compression Engine (HCCE), Causal Reasoning Graph, Long-Horizon Task Planner, Confidence Calibration Head, Safety Reasoning Module (SRM), and more. 1M token context across all tiers.

| Model | Total Params | Active Params | Experts | Context | Link |
|---|---|---|---|---|---|
| Lattice-120B | 120B | ~22B | 64, top-4 | 1M | → |
| Lattice-430B | 430B | ~38B | 128, top-4 | 1M | → |
| Lattice-671B | 671B | ~47B | 256, top-4 | 1M | → |
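The active-parameter column follows from sparse routing: roughly the dense (shared) parameters that every token uses, plus top_k/num_experts of the parameters living inside the experts. A back-of-envelope helper; the 15B shared-parameter figure below is an assumption for illustration, not a published number:

```python
def active_params(total_b, shared_b, num_experts, top_k):
    """Estimate per-token active parameters for a top-k MoE (figures in billions)."""
    expert_b = total_b - shared_b                 # parameters living inside the experts
    return shared_b + expert_b * top_k / num_experts

# Hypothetical split for the 120B tier: 15B shared, 64 experts, top-4 routing.
print(round(active_params(120, 15, 64, 4), 1))  # → 21.6, near the ~22B in the table
```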

View Lattice Collection →


🩸 Matrix ECHO — Living Error Memory

Status: 🔴 Build In Progress · 🟢 Open Source · Language: Rust

The model that remembers how it was wrong.

ECHO is a 27B coding-focused LLM built on Jackrong/Qwen3.5-27B-Claude-4.6-Opus-Reasoning-Distilled, running fully in Rust via HuggingFace candle. Every correction it receives crystallizes into a Scar — a typed, weighted memory object stored in a live petgraph lattice.

Before every response, ECHO scans its Scar lattice for similar past mistakes. The more it's corrected, the harder it is to fool. Mistakes are not erased — they become assets.

Core loop:

prompt → pre-scan Scar lattice → inject caution context → generate → correction → new Scar forms

Scar types: Factual · Logical · Contextual · Hallucination · Overconfidence
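The project itself ships in Rust (candle + petgraph), but the core loop reads the same in any language. A Python sketch using a plain list in place of the petgraph Scar lattice; the field names and the keyword-overlap similarity test are assumptions, not ECHO's real implementation:

```python
from dataclasses import dataclass, field

SCAR_TYPES = {"factual", "logical", "contextual", "hallucination", "overconfidence"}

@dataclass
class Scar:
    """A typed, weighted memory of one correction."""
    kind: str
    topic: str
    lesson: str
    weight: float = 1.0

@dataclass
class EchoMemory:
    scars: list = field(default_factory=list)

    def correct(self, kind, topic, lesson):
        """A correction crystallizes into a new Scar; nothing is erased."""
        assert kind in SCAR_TYPES
        self.scars.append(Scar(kind, topic, lesson))

    def pre_scan(self, prompt):
        """Before responding, surface lessons from similar past mistakes
        so they can be injected as caution context."""
        words = set(prompt.lower().split())
        hits = [s for s in self.scars if s.topic.lower() in words]
        return [f"caution[{s.kind}]: {s.lesson}" for s in hits]

mem = EchoMemory()
mem.correct("factual", "rust", "the borrow checker runs at compile time, not runtime")
print(mem.pre_scan("explain how rust handles memory"))
```

Each prompt thus pays a small lookup cost up front in exchange for not repeating an already-corrected mistake.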

Domain Weakness Map — ECHO tracks which topics it's systematically weak in and suppresses confidence automatically in high-risk domains.
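A weakness map can be as simple as a per-domain error rate that scales reported confidence down. A minimal sketch; the domain names and rates below are invented, and ECHO's actual scoring is not published:

```python
def suppressed_confidence(raw_conf, domain, weakness):
    """Scale confidence down in domains with a track record of mistakes.
    `weakness` maps domain -> observed error rate in [0, 1]."""
    return raw_conf * (1.0 - weakness.get(domain, 0.0))

# Hypothetical error rates accumulated from past Scars.
weakness = {"organic-chemistry": 0.6, "rust-lifetimes": 0.2}
print(suppressed_confidence(0.9, "organic-chemistry", weakness))  # heavily suppressed
print(suppressed_confidence(0.9, "geography", weakness))          # no known weakness
```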

OpenAI-compatible API — drop-in via POST /v1/chat/completions. Corrections via POST /v1/echo/correct.

| Model | Params | Base | Language |
|---|---|---|---|
| ECHO-27B-V1 | 27B | Qwen3.5-27B (Opus 4.6 distilled) | Rust + candle |

View ECHO Collection →


🎨 Matrix Voxel — 3D Generation

Status: 🔴 Planned · Target: A100 40GB

Flow-matching DiT backbone (~2.3B) with task-specific decoder heads. Generates 3D meshes, environments, printable models, and NeRF/Gaussian Splatting outputs.

| Model | Task | Outputs | License |
|---|---|---|---|
| Voxel Atlas | World/environment gen | .vox, .obj, .usd | 🟢 Open |
| Voxel Forge | 3D mesh & assets | .obj, .glb, .fbx, .usdz | 🟢 Open |
| Voxel Cast | 3D printable | .stl, .step, .3mf | 🟢 Open |
| Voxel Lens | NeRF / Gaussian Splatting | .ply (3DGS) | 🟢 Open |
| Voxel Prime | Unified all-in-one | All formats | 🟣 Closed |

🔷 Matrix Vexa — Crystalline Intelligence Substrate

Status: 🔴 Paused · 🟢 Open Source

Vexa is not a model. It is a new intelligence paradigm — a living lattice of Glyphs (structured meaning objects) that grows through Crystallization instead of training. It bootstraps in 10 minutes on any CPU; no GPU required. Knowledge never goes stale — three background threads continuously update the lattice from the web and from interactions, and decay stale entries.

The full paradigm definition and build prompt are complete. The build is paused and will resume.
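Since Glyph internals are unpublished, the following is a purely hypothetical sketch of the idea: structured meaning objects linked in a graph that grows by inserting crystallized facts rather than by gradient updates. Every name and field here is invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Glyph:
    """A structured meaning object: one crystallized piece of knowledge."""
    concept: str
    links: dict = field(default_factory=dict)  # relation -> other concept

@dataclass
class Lattice:
    glyphs: dict = field(default_factory=dict)

    def crystallize(self, concept, relation, other):
        """Growth without training: insert or extend glyphs instead of
        updating any weights."""
        g = self.glyphs.setdefault(concept, Glyph(concept))
        g.links[relation] = other
        self.glyphs.setdefault(other, Glyph(other))

    def recall(self, concept, relation):
        g = self.glyphs.get(concept)
        return g.links.get(relation) if g else None

lat = Lattice()
lat.crystallize("water", "boils_at", "100C")
print(lat.recall("water", "boils_at"))  # → 100C
```

Because knowledge lives in explicit graph entries rather than weights, individual facts can be updated or decayed in place, which is what the background threads described above would operate on.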

View Vexa Collection →


⬛ Kairiq — Critical Moment Intelligence Module

Status: ⬛ Deprecated

A Lume-native intelligence amplifier module designed to wrap Matrix models. Deprecated — the custom Lume language runtime exceeded practical build complexity. Its core ideas (pre-scan, confidence suppression, domain routing) were absorbed into ECHO.


Paradigms

| Name | Type | Status |
|---|---|---|
| Crystalline Intelligence (Vexa) | Non-neural knowledge substrate | 🔴 Paused |
| Living Error Memory (ECHO) | Scar-based mistake crystallization | 🔴 Build In Progress |
| Ferric Attention | Ownership-typed attention mechanism | 🩵 Research concept |

Reserved Names

These names are allocated to specific projects and are not available for other uses.

| Name | Allocated To |
|---|---|
| Vexa | Crystalline Intelligence Substrate |
| ECHO | Living Error Memory LLM |
| Axiom | Future extreme reasoning model (planned) |
| Lume | Declarative-relational language for Vexa |

Licensing

| Model Family | License |
|---|---|
| Zenith | Apache 2.0 |
| Vortex | Apache 2.0 |
| Touch Grass | Apache 2.0 |
| Matrix Lattice | Proprietary |
| Matrix ECHO | Apache 2.0 |
| Matrix Voxel (open tiers) | Apache 2.0 |
| Matrix Voxel Prime | Proprietary |
| Vexa | Apache 2.0 |

Matrix.Corp — building intelligence that knows its own limits.
