I use the most recent available. But it's like this on any Mac I have, no matter the driver or GPU vendor. My theory goes like this: the Renoise GUI on macOS uses some basic GPU compositing, but also a lot of CPU. At 4K resolution, macOS applies some kind of “auto-enlarge 4 times” mode, and that enlarging seems to happen on the CPU, literally quadrupling every single pixel. Could be total nonsense, too.
Well, 74% CPU definitely suggests that's effectively what's going on with your 4K Mac; it's the only logical explanation for such an insane load…
Or possibly an analyzer bug.
But it's clearly not how it should work.
Without a compositor it’s not usable here. I don’t think this is normal.
I have an Intel/Haswell onboard graphics chip; the driver in use is the i915 kernel driver with xorg modesetting. But this graphics stuff completely confuses me on Linux.
Have you tried limiting the Renoise GUI refresh rate after disabling compton? (GUI -> Global -> Limit…)
compton limits redrawing to the rate set in your compton.conf (if it is 0, it uses your screen's refresh rate).
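For reference, the relevant part of a compton.conf looks something like this (a sketch; option availability varies between compton versions, so treat it as an assumption to check against your local man page):

```conf
# ~/.config/compton.conf (relevant fragment)
# 0 = detect and use the screen's refresh rate
refresh-rate = 0;
# software timing optimization that limits paints to the refresh rate
sw-opti = true;
```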
It was set to 60 fps, and changing it doesn't make any difference. I didn't try 0 yet.
In my experience, onboard Intel graphics sometimes tends to be even faster than discrete graphics when it comes to 2D desktop response time, since the onboard GPU has faster access to normal RAM, or something like that.
Here´s a similar thread: Renoise on Linux via Wine is better than native linux renoise? Serious
… but no solution in the end.
The refresh rate has a small influence, but even at 5 fps I get xruns.
With ALSA (instead of JACK) I get the same or a slightly lower DSP load, but no audible xruns when the compositor is off. The GUI is quite laggy and slow, though.
LOG file from Renoise with both Jack and Alsa running is here: https://paste.debian.net/1072901/
I’ve got unfortunately no idea what could go wrong here.
Renoise's internal GFX and window handling doesn't change at all with a window manager's composite mode - that's up to the WM to deal with. The drawing itself in Renoise is basically (apart from all the internal stuff) a simple XPutImage from the window backbuffer to the X11 window's GC on Linux. The “compositor options” apply after that, doing something with the data put into the GC, so it seems that's where the problem lies.
While researching this: we could actually try to use the more “modern” XRenderComposite approach to do the updates (see https://email@example.com/msg05968.html). Whether this fixes your problem and/or creates new ones is hard to say, though. In general it seems to be faster…
Maybe Compton is just a bad compositor option?
I have personally tested two pretty different machines with Nvidia / Intel video cards and various combinations of both proprietary and free drivers with:
- Manjaro Deepin (Arch base + custom variant of mutter compositor)
- Linux Deepin (Debian Unstable base + custom variant of mutter compositor)
- Manjaro KDE (Arch base + KWin compositor)
- Fedora 29 (Mutter, I suppose; both X / Wayland)
- Arch Deepin (Arch base + Compiz)
Those are very different systems and compositors, each with their pros and cons, quirks and bugs (especially mutter + nvidia), but one thing was constant across my tests - Renoise performed outstandingly in all cases, nothing like @lilith described…
I was really surprised / disappointed to see this…
So it must be a DE / Compton issue.
Maybe, if there's any room to test, it's a good idea to try out https://manjaro.org/download/xfce/, which uses more modern versions of XFCE / Compton / the kernel / video drivers.
Debian is a little behind, so perhaps there were some bugs in XFCE / Compton that have since been fixed and could cause this?
Thanks for commenting!! Just to clarify: there's no problem when compton or the internal XFCE compositor is running. I get the problem when no compositor is running.
“While researching this: We could actually try to use the more “modern” XRenderComposite approach to do the updates (See https://firstname.lastname@example.org/msg05968.html). Whether this fixes your problem and/or creates new ones is hard to say, though. In general it seems to be faster…”
Is this something I can try, or are you thinking about changing something in Renoise?
I'm no expert, but I've never had issues with Renoise on Ubuntu 18.04 LTS (out of the box) on T440/T450/T460/T470(/s) laptops, if that makes sense.
I also have no problems, with an Intel laptop and an Nvidia workstation, both using open source drivers.
The idea of disabling the compositor when realtime audio is running comes from bad GPU drivers that can sometimes mess with a very tightly tuned realtime audio setup. I found this only relevant when using the Nvidia or AMD closed source drivers, and that status changed over the years with driver updates. Open source drivers are much more “polite” in their operations; if you set up IRQ and realtime priorities properly, there should be no problem running whatever GPU load in parallel with realtime audio.
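For reference, the usual realtime-priority setup on Linux looks something like this (a sketch of a PAM limits fragment, assuming your user is in an `audio` group; exact values and file names vary per distro):

```conf
# /etc/security/limits.d/audio.conf
@audio   -  rtprio     95
@audio   -  memlock    unlimited
@audio   -  nice      -19
```

After adding this you need to log out and back in, and JACK's realtime mode should then be able to acquire its scheduling priority.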
I'd still think the difference might come from the WM not using GPU acceleration at all with compositing disabled. Then the problem would also be independent of Renoise. Maybe there exists some benchmark or method to measure WM performance?
Now I know a little bit more: I installed the performance tools in Debian.
The main difference between running Renoise with and without the compositor is a kernel function called drm_clflush_pages(). When the compositor is off, the overhead of this function (the counters can be zeroed with the z key to show the difference or overhead) is 99% compared to compositor on, and it's quite constant at that number. So I assume this is responsible for the xruns.
I also found something here, but I don't have a clue at the moment what to do or what to test:
edit: I’m on #intel-gfx and #dri-devel to get some help … hope someone answers
A hypothesis from a guy on #xorg:
“my current hypothesis is that the program constantly redraws this region, probably more than 60fps. without composite each draw is turned into gpu operation and performed by the card. when composite is used, the draw operation is done in system RAM, and only on vsync it is drawn to the card.”
He also thought it would be better to use XShmPutImage() (the MIT-SHM shared-memory variant) instead of XPutImage().
It seems to be due to the outdated Mesa 13 in Debian stable. It is very likely solved in Mesa 18, which can be installed from backports. This is what I'll try in the next few days…
So, have you tested it?
Sorry, I'm really keen to know.
I believe your Mesa assumption should hold / explain a lot.
I haven't dared to upgrade Mesa yet. I think I'll continue using Renoise with compton, and if I still get freezes I might switch to MX Linux. I just booted a live stick and it looks really nice. It's based on Debian Stable, but uses more recent kernels and also Mesa 18. I'll try to get Renoise running from the stick and test whether I still get the same issues.
MX Linux is legit. I definitely would recommend it. Their repo has a ton of up to date software that really complements the Debian base, without the headache of a gazillion PPAs and constant updates borking your install. Compiz seems to be the preferred compositor based on what I’ve read in the forums. (Compton and Compiz are both available in their package installer, which is simple but quite nice)
Reminds me of what happened when I tuned my workstation for Renoise. I had weird “soft” xruns… after lots of research, tuning and perf tracing it turned out to be the Nvidia open source driver clinging to the CPU under certain circumstances, and updating the whole kernel fixed it (and with it Xorg/nouveau), as a newer version was improved to be more respectful of realtime/interrupt priorities…
I just tested it on an older laptop with:
Graphics: Card: Intel Mobile 4 Series Integrated Graphics Controller
X.Org: 1.18.3 drivers: intel (unloaded: fbdev,vesa) Resolution: email@example.com
GLX Renderer: Mesa DRI Mobile Intel GM45 Express x86/MMX/SSE2 GLX Version: 2.1 Mesa 11.2.0
-> No issue here. It's working perfectly.