What would you want AI to do in Renoise?

Not yet, but a question: couldn't the

"Bridge Application - Python service that handles communication between Renoise and AI models"

…be made inside the Renoise tool with Lua? Is Python necessary, or just convenient because these kinds of frameworks are already out there?

fun, gimme

Renoise AI Assistant - My First Coding Project

Hey Renoise fam! After years of tracker wizardry, I’m diving into coding with this AI assistant for our beloved Renoise. [1]

The Technical Stuff (Still Learning)

I’m figuring out this Python thing as a bridge between Renoise and AI models. Lua scripting in Renoise is familiar territory, but this Python business is new to me. Seems like the smart approach though. [2]
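For what it's worth, Renoise ships with an OSC server (enabled in its preferences, default UDP port 8000), so a Python bridge can talk to it over plain sockets without any heavy framework. Here is a minimal sketch, assuming the default port and Renoise's stock `/renoise/song/bpm` OSC endpoint; the encoder covers only int, float, and string arguments:

```python
import socket
import struct

def osc_message(address: str, *args) -> bytes:
    """Hand-encode a minimal OSC message (int32/float32/string args only)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)

    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)  # big-endian float32
        else:
            tags += "s"
            payload += pad(str(a).encode())
    return pad(address.encode()) + pad(tags.encode()) + payload

def send_to_renoise(msg: bytes, host="127.0.0.1", port=8000) -> None:
    """Fire-and-forget UDP send to Renoise's OSC server."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(msg, (host, port))

# Example: set the song tempo to 140 BPM.
msg = osc_message("/renoise/song/bpm", 140)
```

Whether the AI side then lives in Python or in Lua is a separate question; the wire format either way is just a few padded byte strings.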

What It Should Do (When I Get It Working)

  • Turn text descriptions into actual patterns (mind-blowing!)
  • Help organize my chaotic sample collection
  • Suggest sound design tweaks when I’m stuck
  • Maybe fix my terrible mixing? [3]
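The "text descriptions into actual patterns" idea can be prototyped without any AI at all. Here's a toy sketch where a few hard-coded phrases map to 16-step drum placements; the vocabulary and step layout are invented for illustration, and a real tool would hand the parsing off to an LLM and then write the notes via the Renoise scripting API:

```python
# Toy sketch: turn a text description into a 16-step drum pattern.
# The phrase vocabulary below is entirely made up for illustration.

def text_to_pattern(description: str, steps: int = 16) -> dict:
    recipes = {
        "four on the floor": ("kick", range(0, steps, 4)),
        "offbeat": ("hat", range(2, steps, 4)),
        "backbeat": ("snare", (4, 12)),
    }
    pattern = {}
    for phrase, (instrument, hits) in recipes.items():
        if phrase in description.lower():
            pattern[instrument] = sorted(hits)
    return pattern

pattern = text_to_pattern("four on the floor kick with a backbeat snare")
# pattern -> {"kick": [0, 4, 8, 12], "snare": [4, 12]}
```

The interesting (hard) part is exactly the gap this toy skips: going from arbitrary natural language to those structured placements.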

Current Status

Total work-in-progress. The Renoise part I’ve got handled (obviously), but still wrestling with the AI communication bits. Learning as I go! [4]

Repo link: GitHub - nickcent/Renoise-AI-Assistant: Control Renoise with natural language using AI

If any coding ninjas want to help a Renoise veteran/programming rookie, I’d appreciate feedback! But if you want Renoise workflow tips in return, I’ve got you covered. [5]


Even this post was made with AI, obviously… I’m not familiar with Lua either :stuck_out_tongue_winking_eye:


We need to make RENOISE MCP.


Has this thread ended up in AI generated posts?

The robots are apparently excited about playing with Renoise.


Only AI listeners can listen to songs created by AI.

That’s a nice start for a tool, wow. It might be especially interesting for people with disabilities. On the other side, I think such a thing should be thought through more from a bird’s-eye perspective. Don’t recent macOS versions already provide such functionality, too? And what about Windows?

Well, MIDI 2.0 CI and property exchange were made for this. They are already available in Korg’s Keystage and their own recent VSTis. Sadly, in a market without an authority there are no common standards. I actually don’t think using AI is the right approach, since this is all about simplifying and standardizing the APIs. Renoise could also implement very simple workflow improvements here, like Bitwig’s recent “touch control with mouse + move knob, done” mode. I think it is not about the discovery of possible parameters at all; instead, AI would overcomplicate things here.

Personally I would be really interested in AI driven composing assistance, not only in Renoise. So for example you have a melody and tell it “generate some chords for it”, or “generate an alternative version”, “adapt melody to drum rhythm”. Such stuff, very integrated in the DAW, and always only as a suggestion, similar to recording multiple takes.
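The "generate some chords for my melody" request can be sketched in a very naive way: harmonize each melody note with the diatonic triad built on it. This sketch assumes C major only and ignores key detection, inversions, and voice leading, which is exactly what a real composing assistant would have to handle:

```python
# Naive harmonization sketch: for each melody note, stack scale thirds
# on top of it within C major. Everything beyond this (key detection,
# voice leading, rhythm) is where actual AI assistance would come in.

C_MAJOR = ["C", "D", "E", "F", "G", "A", "B"]

def triad_on(note: str) -> list:
    """Build the diatonic triad rooted on a C-major scale degree."""
    i = C_MAJOR.index(note)
    return [C_MAJOR[i], C_MAJOR[(i + 2) % 7], C_MAJOR[(i + 4) % 7]]

def harmonize(melody: list) -> list:
    return [triad_on(n) for n in melody]

chords = harmonize(["C", "F", "G"])
# chords -> [["C", "E", "G"], ["F", "A", "C"], ["G", "B", "D"]]
```

Presenting results like these only as suggestions, like multiple recorded takes, fits the workflow described above.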

I have no idea how advanced current AI is regarding composition.

Renoise as it is is basically a perfect DAW; any additional features at this point are “nice to haves”, besides some minor tweaks / bug fixes.
That’s really the appeal of Renoise in many ways: it’s well balanced.
If it were something like Ableton, a huge company with massive overhead, they would be forced to bloat it out to keep up with quotas etc.

I’ve been using Renoise since I was 15 (about 15 years) and I still find features that I haven’t used. I write music in Renoise faster than in anything else.

You can already make generative music with a myriad of different programs and then sample the results into Renoise.

I guess the draw of this new wave of AI would be to learn from your patterns etc. and create something based on that, but what’s the point, really? It seems like hopping on a bandwagon that isn’t necessary.

Now if you could control Renoise via a BCI, that’s a different story…


You can already make generative music with just Renoise, too :slight_smile: Yxx can do a lot, especially when nested in phrases that are key-mapped and then given their own Yxx commands…
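A toy simulation of that kind of probabilistic triggering, written in the spirit of the Yxx "maybe" command: each step fires with its own probability. The 0-255 range mirrors the hex argument, but the exact Renoise semantics of Yxx may differ from this sketch:

```python
import random

# Toy probabilistic step sequencer in the spirit of Renoise's Yxx
# "maybe" command. Each step is (note, probability 0-255); this is an
# illustrative approximation, not Renoise's exact behavior.

def render(steps, rng=None):
    """Return the notes that fired this pass (None where a step skipped)."""
    rng = rng or random.Random()
    return [note if rng.randrange(256) < prob else None
            for note, prob in steps]

# Probability 255 ≈ always, 0 = never; 128 ≈ a coin flip per pass.
line = render([("C-4", 255), ("E-4", 0), ("G-4", 128)], random.Random(42))
```

Nesting such probabilistic layers, as phrases inside phrases, is what makes the purely tracker-native approach surprisingly generative.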

Everybody is so angry at AI.
I want AI to make patterns that I describe, the same way I can use Cursor AI or avante in Neovim.

I salute your effort, that’s your thing, but I’m going to go and cry now

You are referencing an .xrnx file in the docs, but I don’t see it anywhere.

He hasn’t wrapped this into an .xrnx using GitHub Actions yet.
@cent_gr


I’m new to vibe coding. This is my first GitHub repo, and it’s not the easiest first project. I have been spending all my time on learning and vibe coding, and I will give it my all to make software, whatever that means. This project will defo continue. The tool still gets installed in Renoise even without the .xrnx format: you can drag and drop the Tool folder, including the manifest.xml, and it should work. Again, it’s my first repo ever; I still have to learn Git and all…

Here’s a GitHub Action that I use to wrap up an .xrnx.

The difference between this and what you have is that I zip the whole repository into an .xrnx; you would have to modify it to zip only the Tool folder into the .xrnx.
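Since an .xrnx is just a zip archive of the tool folder (renamed, with manifest.xml at the archive root), the "zip only a folder" step is small enough to sketch directly; the folder and output names here are illustrative:

```python
import zipfile
from pathlib import Path

# An .xrnx is a plain zip of the tool folder, renamed. The key detail
# is storing paths relative to the folder so manifest.xml lands at the
# root of the archive, where Renoise expects it.

def pack_xrnx(tool_dir: str, out_path: str) -> None:
    tool = Path(tool_dir)
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as z:
        for f in sorted(tool.rglob("*")):
            if f.is_file():
                z.write(f, f.relative_to(tool).as_posix())

# Usage (hypothetical paths):
# pack_xrnx("com.nickcent.RenoiseAIAssistant.xrnx-src", "out/tool.xrnx")
```

A CI step or a local script can call this equally well; either way the archive contents are what matter, not where the zipping happens.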

Would implementing an MCP server produce better results than this?
As I understand it, the Renoise AI Assistant generates Lua code and evals it inside Renoise. I would expect MCP not to generate uncompilable code, but to directly perform more meaningful actions… I don’t know if it would work.
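The core of that MCP argument can be shown in a few lines: instead of eval-ing arbitrary generated Lua, the server exposes a fixed set of named tools and validates the model's structured call before doing anything. The tool names and parameter schemas below are invented for illustration:

```python
# Sketch of the MCP-style idea: a fixed tool registry plus validation,
# instead of eval-ing whatever Lua the model produced. Tool names and
# schemas here are hypothetical, not from any real Renoise MCP server.

TOOLS = {
    "set_bpm": {"params": {"bpm": int}},
    "add_track": {"params": {"name": str}},
}

def dispatch(call: dict) -> str:
    name, args = call.get("tool"), call.get("args", {})
    if name not in TOOLS:
        return f"error: unknown tool {name!r}"
    for param, typ in TOOLS[name]["params"].items():
        if not isinstance(args.get(param), typ):
            return f"error: {param} must be {typ.__name__}"
    # A real server would now perform the action (e.g. via OSC or Lua).
    return f"ok: {name}({args})"

result = dispatch({"tool": "set_bpm", "args": {"bpm": 140}})
```

A malformed or hallucinated call fails validation and returns an error to the model, rather than crashing Renoise with uncompilable code, which is exactly the advantage being asked about.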

A locally trained LLM, plus MCP for an external LLM (agentic workflow with dockerized n8n).
Something along these lines. I’m learning… WE will get this done.
This is the new norm.

Sometimes I use AI to create a specific sound. Elevenlabs has a sound generator tool and if you ask it to create an 80s cowbell sound with ridiculous reverb, the result is somehow quite nice.

My main issue with AI in software is that it usually relies on cloud services, or takes very, very much GPU/CPU power if run locally.

I use Linux because I don’t like clouds and software that phones home every second. If Renoise had AI integrated, I would probably not upgrade to that version. (Oh, and I have been a Renoise user for over 25 years now.)