
An analog spiking neural network chip that runs AI at microwatt power, thousands of times more efficiently than conventional processors.
Data centres in Ireland already consume over 20% of the country's electricity. Frontier AI labs are building gigawatt-scale facilities that rival entire cities. And leaders in the space are clear: this is accelerating, not slowing down.
Every neuron fires, every connection computes. All at once, every cycle.
A smooth, continuous stream of numbers, always flowing.
A rigid clock. Everything in lockstep.
Like leaving every light in a building on to read one book.
Only active neurons fire. The rest stay silent.
Sharp spikes, only when something happens. Silence is information.
No clock. Neurons respond when events arrive.
Like your brain: billions of neurons, only a fraction active at any moment.

Input currents charge a membrane capacitor. Weights are set by resistances: each input contributes a current I = V/R, and the contributions sum at the membrane node by Kirchhoff's current law.
A parallel resistor drains the capacitor with time constant tau_m = R*C. Exponential decay makes the neuron forget old inputs.
When the membrane crosses 0.8V, a comparator (LMV7219) fires a spike, which propagates to downstream neurons as a current injection.
An analog switch (SN74LVC1G66) shorts the capacitor to ground, enforcing a brief refractory period before the neuron can fire again.
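The four steps above reduce to a handful of state variables. Here is a minimal sketch of one neuron's behavior under that circuit model, using forward-Euler integration; the 0.8 V threshold comes from the design, while R_LEAK, C_MEM, the drive current, timestep, and refractory period are illustrative placeholders, not the actual component values.

```python
# Minimal sketch of one LIF neuron under the circuit model above.
R_LEAK = 1e6      # leak resistor, ohms (assumed)
C_MEM = 10e-9     # membrane capacitor, farads (assumed); tau_m = R*C = 10 ms
V_TH = 0.8        # comparator threshold, volts (from the design)
DT = 1e-5         # integration step, seconds
T_REFRAC = 1e-3   # refractory period, seconds (assumed)

def step(v, i_in, refrac_left):
    """Advance the membrane one timestep; return (v, fired, refrac_left)."""
    if refrac_left > 0:                    # analog switch still holds the cap low
        return 0.0, False, refrac_left - DT
    # Kirchhoff at the membrane node: C * dV/dt = I_in - V / R_leak
    v += DT * (i_in - v / R_LEAK) / C_MEM
    if v >= V_TH:                          # comparator trips: spike, then reset
        return 0.0, True, T_REFRAC
    return v, False, 0.0

# Drive the neuron with a constant 2 uA and count spikes over 100 ms.
v, refrac, spikes = 0.0, 0.0, 0
for _ in range(int(0.1 / DT)):
    v, fired, refrac = step(v, 2e-6, refrac)
    spikes += fired
print(f"spikes in 100 ms: {spikes}")       # steady drive -> regular firing
```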
Designing analog hardware means simulating real physics. SPICE takes 30 minutes per inference. We need millions. Gilgamesh makes training possible.
One MNIST inference through our best small-scale 36-12-10 spiking neural network.
Training requires tens of thousands of these simulations. At 30 minutes each, SPICE would take years. Gilgamesh makes it possible in hours.
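One reason a purpose-built simulator can beat SPICE by orders of magnitude: instead of solving the transistor-level netlist, it can step idealized LIF dynamics for an entire layer, and an entire batch of images, with dense matrix operations. Below is a minimal sketch of that idea in NumPy; the function name, normalized units, and constants are assumptions, not Gilgamesh's actual internals.

```python
# Sketch: batched, layer-at-a-time LIF simulation in place of circuit-level
# SPICE. Works in normalized units where the drive equals the steady-state
# membrane voltage; simulate_layer and all constants here are assumptions.
import numpy as np

def simulate_layer(inputs, G, tau=0.01, v_th=0.8, dt=1e-4, steps=200):
    """inputs: (batch, n_in) input levels held constant over the window.
    G: (n_in, n_out) synaptic conductances (one per physical resistor).
    Returns spike counts per output neuron, shape (batch, n_out)."""
    v = np.zeros((inputs.shape[0], G.shape[1]))
    counts = np.zeros_like(v, dtype=int)
    drive = inputs @ G                  # Kirchhoff summation, all neurons at once
    for _ in range(steps):
        v += (dt / tau) * (drive - v)   # leaky integration toward steady state
        fired = v >= v_th               # comparator threshold
        counts += fired
        v[fired] = 0.0                  # reset on spike
    return counts

# 32 images through a 49-input, 10-neuron layer in one shot. Weights are
# random here, so the "predictions" are meaningless until trained.
rng = np.random.default_rng(0)
x = rng.random((32, 49))
print(simulate_layer(x, rng.random((49, 10)) * 0.2).argmax(axis=1))
```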
We built a proof-of-concept board to validate the architecture. The results exceeded expectations. Now we are manufacturing a dedicated 49-9-10 chip.
Handwritten digits fed as analog voltages. Classified by counting output spikes across 10 output neurons.
The manufacturing target is a 49-9-10 network: 49 inputs, 9 hidden LIF neurons, 10 outputs. Each synapse is a physical resistor, 49x9 + 9x10 = 531 in total.
The analog core draws ~30 microwatts during inference. When idle, it is fully off: zero standby drain. Not sleeping. Off.
A 7x7 pixel image goes in as analog voltages. Spikes come out. The neuron that fires most wins. The whole thing runs on microwatts.
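Putting those pieces together, one inference has this shape: scale the 49 pixels to input levels, run the two layers, and pick the output neuron with the highest spike count. A hedged sketch reusing simulate_layer from above; treating hidden spike counts as rate-coded drive to the next layer is a simplification, and all weights are random placeholders for the trained resistor values.

```python
import numpy as np
# Assumes simulate_layer from the previous sketch is in scope.
rng = np.random.default_rng(1)

img = rng.random((7, 7))                      # stand-in for a 7x7 digit
v_in = img.reshape(1, 49)                     # pixels as analog input levels (0-1, assumed)

G1 = rng.random((49, 9)) * 0.2                # input -> hidden resistors (placeholder)
G2 = rng.random((9, 10)) * 0.5                # hidden -> output resistors (placeholder)

hidden = simulate_layer(v_in, G1)             # (1, 9) spike counts
rate = hidden / max(hidden.max(), 1)          # rate-coded simplification of spike injection
out = simulate_layer(rate, G2)                # (1, 10) spike counts
print("predicted digit:", int(out.argmax()))  # the neuron that fires most wins
```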

Implants running neural inference on microwatts, powered by body heat. No batteries to replace.
Always-on anomaly detection for infrastructure. Stick a sensor on and forget about it for years.
Edge inference at nanojoule scale. Drones, robots, and IoT that think locally.
MNIST handwritten digit classification is the standard benchmark for proving AI hardware feasibility. These results validate that Tarski's analog circuits can learn and classify; they are not a production deployment target.
| Architecture | Image | Params | Mode | Accuracy | Status |
|---|---|---|---|---|---|
| 36-6-10 | 6x6 | 276 | Physics | 85.05% | |
| 36-12-10 | 6x6 | 552 | Physics | 91.38% | |
| 49-9-10 | 7x7 | 531 | Physics | 90.14% | MANUFACTURING |
| 36-12-10 | 6x6 | 552 | Physics | 96.43% | BEST |
Third-year EEE student at University of Galway. Built the Gilgamesh simulator, designed the network architecture, and wrote the training pipeline. Runs Eltrus Limited, a medical software company serving 400+ patients.
Hardware co-designer on the 22,000-component PCB layout and assembly. Responsible for the physical neuron circuits, component selection, and board-level integration.