I built Pysey — a free, open-source desktop emulator for the Critter & Guitari EYESY video synthesizer

I’ve had a Critter & Guitari EYESY for a while now. If you’re not familiar with it: it’s a black box video synthesizer that takes audio and MIDI input and outputs real-time visuals over HDMI. The visuals are generated by Python scripts called “modes” — each mode is a self-contained instrument that responds to audio in its own way. Five hardware knobs control parameters in real time. It’s genuinely great for live performance.

The workflow for developing new modes is not great. You write your Python script, you copy it to the EYESY over a network connection, you restart the mode system, you see if it works. If it doesn’t — which it often doesn’t on the first pass — you go back to your editor, make changes, and do it again. For anything with real visual complexity, that loop gets old fast.

So I built Pysey — a macOS desktop application that emulates the EYESY hardware. Write a mode, run it immediately in the emulator, see exactly what it looks like. No hardware required to develop, no round-trips to a Raspberry Pi to test.

What Pysey actually does

The core function is straightforward: Pysey runs EYESY modes in a desktop window, unmodified, on macOS. The same Python script that runs on your EYESY hardware runs in Pysey. You get the same audio reactivity, the same five virtual knobs, the same 30fps rendering loop — just on your laptop instead of the box.
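Conceptually, the emulator's job reduces to driving that same contract the way the hardware does: call setup once, then call draw on a fixed ~30fps tick. This is a sketch of the idea, not Pysey's actual code:

```python
import time

def run_mode(mode, etc, screen, frames=None):
    """Drive a mode the way the hardware does: setup once, then draw
    at a fixed ~30 fps tick. `mode` is any object with setup/draw."""
    mode.setup(screen, etc)
    frame_time = 1.0 / 30.0
    n = 0
    while frames is None or n < frames:
        start = time.monotonic()
        mode.draw(screen, etc)  # render one frame
        n += 1
        # sleep off the remainder of the frame budget
        elapsed = time.monotonic() - start
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)
    return n
```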

Beyond the basic emulation, there are a few things I built in that make it more useful for performance as well as development:

  • Multiple video outputs. Syphon for routing into VJ software like Resolume, NDI for network broadcast, and OBS Virtual Camera so you can use EYESY visuals in Zoom, Teams, or OBS directly.
  • MIDI control. Connect a hardware MIDI controller and use it to drive the virtual knobs the same way you would with EYESY hardware.
  • Ableton Link. Tempo sync across Ableton Live, Traktor, or any of the 300+ apps that support Ableton Link — useful if you want your visuals synced to a live set.
  • Hardware connection. Pysey can connect to a real EYESY over SSH and push modes and scenes to it directly. You develop in the emulator and deploy to hardware without leaving the app.
  • Mode browser. Browse and install modes from the community, including anything on Patchstorage, without leaving the app.

It supports both EYESY OS versions — the original v2.3 (Python 2.7) and the current v3.1 (Python 3.11). It even has a compatibility shim that lets you run Python 2 modes on v3.1 without modifying them, which covers most of the existing community mode library. For what it’s worth, I prefer the EYESY OS v2.3, as the latest hardware OS has some minor regressions I don’t love.
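The shim works by pre-loading Python 2 names into the mode's namespace before executing it. A minimal sketch of the idea (the two shims shown are illustrative, not the full list, and syntax-level differences like old-style print statements can't be fixed by a name shim alone):

```python
def load_legacy_mode(source):
    """Execute a Python 2-era mode under Python 3 by injecting shims
    for common missing builtins. Illustrative, not exhaustive."""
    namespace = {
        "xrange": range,   # Py2 xrange is gone in Py3
        "unicode": str,    # Py2 unicode text type -> Py3 str
    }
    exec(source, namespace)
    return namespace
```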

The modes I built alongside it

Developing Pysey meant I also had a development environment to actually use. Over the course of the project I wrote 17 original visual modes covering a range of different algorithmic and aesthetic approaches: strange attractors, reaction-diffusion simulations, Lissajous curves, Whitney fans, Chladni resonance patterns, Voronoi tessellation, flow fields, Barnsley ferns, Stereo Interference patterns, and several others.

I also wrote two meta-modes — modes that dynamically load and cycle through other installed modes. One randomizes through your mode library with a retro VCR overlay. The other does the same thing but sweeps the virtual knobs through an LFO pattern in sync with tempo, so every mode in the rotation gets driven differently. They’re weird and I like them a lot.
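The tempo-synced sweep is just a low-frequency oscillator evaluated once per frame against the Link beat count; something like this sketch (the parameter names are mine, not the mode's actual code):

```python
import math

def lfo_knob(beat, rate=0.25, phase=0.0):
    """Return a knob value in 0.0-1.0 from a sine LFO.
    `beat` is the current Link beat count; `rate` is cycles per beat."""
    return 0.5 + 0.5 * math.sin(2 * math.pi * (beat * rate + phase))
```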

The modes are available on Patchstorage under my guerrilladigital account.

Demo reels featuring Pysey Synth visuals

Note: there’s some latency in the OBS recordings, so the audio and video are slightly out of sync, but you can still get the idea.

Open source and free

Pysey is free and open source — GPL v3, on GitHub. I’m not a member of the Apple Developer Program, so the .dmg isn’t signed or notarized; macOS may show a security warning the first time you open it. Apple Silicon and Intel both work.

I announced it on both the Critter & Guitari community forum and Reddit in February 2026. The response has been positive: people are using it, and the main feedback so far is that it works.

If you have an EYESY or you’ve been curious about EYESY-style visual synthesis and want to experiment without the hardware, you can download Pysey at pysey-synth.com.


Posted by Martin Defatte
