There are still AGP cards sold new, who knows why; probably for stuff like your task: trying to beef up old PCs.
My guess would be to try to get an AMD/ATI card and use the slim and pretty advanced open source drivers. But you'd better do some research of your own on how the situation is for AGP and the specific model you are looking at, and which aspects are deprecated in modern kernels and thus maybe already flaky in operation. Keep in mind that even in Linux, projects get dropped from active development to bare maintenance because nobody uses them anymore, and then the bitrot starts kicking in.
I'd also just expect slow graphics. AGP is very slow, even in 2D at high resolutions, and I'd guess an old CPU will be more of a bottleneck for the gfx than the bus anyway. I dabbled with such an attic PC under Linux some years ago; these things are snails and will never become speedy, period. They work, but even a second-hand, more recent office model bought at the price point of the AGP graphics card alone will be a rocket in comparison. I think 3D is unimportant for Renoise (yet), so the smallest (and also coolest, quietest, and least power hungry) model should suffice.
Off the top of my head I'd also assume that single core machines are not really good for Renoise, because the GUI and the privileged audio/DSP threads fight somewhat, to the loss of the GUI. I have a hexacore machine, but allow Renoise only 5 cores for DSP, partly because the GUI stays fluid under heavy load this way.
To gain power for the DSP and tune latency in Linux, use a good lowlatency or realtime kernel with realtime privileges enabled, disable hyperthreading, disable CPU reclocking, and set a low "cpu_dma_latency" to prevent the longer power-state switches. (When disabling reclocking and the dma_latency state switches, please monitor the heat around the CPU; it can get very hot this way...) Privilege your audio interface, the involved clocks, and input interrupt devices over everything else, especially over the graphics, with the rtirq script.

Then test the scenario with cyclictest (in my Ubuntu the "rt-tests" package, or rt-tests built from source for each custom kernel) in SMP mode: at priority 95, where the Renoise DSP threads will be, as a dry test, and at prio 96 while Renoise is running, to see if the µsecs stay low in these states. When the cyclictest latency is still high under proper tests, deeper debugging must start; I once had the open nvidia driver steal chunks in the thousands-of-µs realm, which of course sucks and needs to be tackled.

Don't expect this to make Renoise run in the single-digit-ms latency realm; Renoise isn't really a strict realtime application, but the better and more stable the bed you place it in, the more performance you will be able to squeeze out of it without dropouts.
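The "cpu_dma_latency" part can be scripted instead of done by hand. Here's a minimal sketch (the function name and fd-holding pattern are mine, not from any existing tool) that asks the kernel's PM QoS layer to cap CPU wakeup latency; the request only holds while the file descriptor stays open, and it needs root. Linux only, obviously:

```python
import os
import struct

CPU_DMA_LATENCY = "/dev/cpu_dma_latency"

def request_max_latency(us: int) -> int:
    """Ask the kernel to cap CPU wakeup latency at `us` microseconds.

    The kernel keeps the request active only as long as the returned
    file descriptor is open and drops it automatically on close, so
    hold on to the fd for the whole session.
    """
    fd = os.open(CPU_DMA_LATENCY, os.O_WRONLY)
    # The device accepts the value as a raw native-endian signed 32-bit int.
    os.write(fd, struct.pack("i", us))
    return fd

if __name__ == "__main__":
    try:
        # 0 us = keep the CPUs out of the deep sleep states entirely.
        fd = request_max_latency(0)
        input("Holding cpu_dma_latency request; press Enter to release...")
        os.close(fd)
    except PermissionError:
        print("Run as root to set cpu_dma_latency.")
```

Run it in a terminal alongside Renoise and release it when you're done, then check the effect with cyclictest as described above (e.g. `sudo cyclictest --smp -p 95 -m` as the dry test).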