Bad performance without compositing - Linux

What is also funny: when I cover the Renoise analyzer region with any other window (explorer or whatever), the CPU usage decreases. :grimacing:

I am on macOS with a 4k resolution. The GUI looks like it's drawn completely by the CPU here. If I switch to full HD or make the Renoise window very small, I only have about a quarter of the usage. So I assume that's proof that the GUI is drawn by the CPU?

That sounds like a decent, modern window manager to me. macOS also has such a feature: covered areas won't be rendered anymore, but only if the software supports it. It actually seems to me that the Renoise GUI code is much better optimized for Linux than it is for macOS, and uses some hardware acceleration over there? I have no idea, only Taktik could enlighten us.


Yeah… it seems so, but it probably shouldn't?

Since I don't have it, for example, on Linux or on Windows…

What video-drivers do you guys use?

I use the most recent available. But it is like this on any Mac I have, no matter which driver or GPU vendor. My theory goes like this: the Renoise GUI on macOS uses some basic GPU compositing, but also a lot of CPU. Since it's a 4k resolution, macOS applies some kind of "auto-enlarge 4 times" scaling mode. That enlarging seems to use the CPU, literally quadrupling every single pixel on the CPU. Could be total nonsense, too.

Well, 74% CPU definitely suggests that's effectively what's going on with your 4k Mac; it's the only logical explanation for such an insane load…

Or maybe an analyzer bug.

But it’s clearly not how it should work :woozy_face:

Without a compositor it’s not usable here. I don’t think this is normal.

I have an Intel/Haswell onboard graphics chip, and the drivers in use are the i915 kernel driver and xorg modesetting. But this graphics stuff completely confuses me on Linux.

@lilith

Have you tried limiting the Renoise GUI refresh rate after disabling compton? (GUI → Global → Limit…)

compton limits the redrawing to a rate set in your compton.conf (if it is 0, it uses your screen's refresh rate).
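
For reference, a minimal compton.conf excerpt with the setting I mean looks roughly like this (the values are just examples, and whether you need the vsync line at all depends on your setup):

```
# ~/.config/compton.conf (excerpt, illustrative values)
refresh-rate = 0;        # 0 = let compton detect the screen's refresh rate
vsync = "opengl-swc";    # optional; one of compton's vsync methods
```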

It was set to 60 fps, and when I change it, it doesn't make any difference. I didn't try 0 yet.

In my experience, onboard Intel graphics sometimes tends to be even faster than discrete graphics when it comes to response time on a 2D desktop, since the onboard GPU has faster access to normal RAM or something.

Here's a similar thread: Renoise on Linux via Wine is better than native linux renoise? Serious

… but no solution in the end.

The refresh rate has a small influence, but even with 5 fps I get xruns.
With ALSA (instead of JACK) I get the same or a slightly lower DSP load, but no audible xruns when the compositor is off. The GUI is quite sluggish and slow, though.

Log file from Renoise with both JACK and ALSA running is here: https://paste.debian.net/1072901/

Unfortunately, I've got no idea what could be going wrong here.

Renoise's internal GFX and window handling doesn't change at all with a window manager's composite mode - that's up to the WM to deal with. The drawing itself in Renoise is basically (apart from all the internal stuff) a simple XPutImage from the window backbuffer to the X11 window's GC on Linux. The "compositor options" will apply after that, doing something with the data put into the GC, so it seems this is where the problem is.
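
To illustrate for people following along, here is a minimal, self-contained sketch of the "CPU backbuffer + XPutImage" update style. This is not Renoise's code; the window setup, the 32-bpp ZPixmap pixel format and the sizes are just assumptions for the example:

```c
/* Minimal sketch of an XPutImage-style update, roughly as described above.
 * NOT Renoise code; pixel format and sizes are illustrative only.
 * Build: cc xput_demo.c -o xput_demo -lX11 */
#include <X11/Xlib.h>
#include <stdlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    int scr = DefaultScreen(dpy);
    int w = 640, h = 360;

    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0, w, h,
                                     0, 0, BlackPixel(dpy, scr));
    XSelectInput(dpy, win, ExposureMask);
    XMapWindow(dpy, win);
    GC gc = XCreateGC(dpy, win, 0, NULL);

    /* "Backbuffer": a client-side pixel buffer the application draws into
     * entirely with the CPU. */
    char *pixels = calloc((size_t)w * h, 4);
    XImage *img = XCreateImage(dpy, DefaultVisual(dpy, scr),
                               DefaultDepth(dpy, scr), ZPixmap, 0,
                               pixels, w, h, 32, 0);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == Expose) {
            /* Push the client-side buffer to the server. Without a compositor
             * this goes more or less straight to the driver; with one, it ends
             * up in the compositor's offscreen buffer and is shown on vsync. */
            XPutImage(dpy, win, gc, img, 0, 0, 0, 0, w, h);
            XFlush(dpy);
        }
    }
}
```

The relevant detail is that every update re-sends the full pixel data to the server, and that is exactly the step where having or not having a compositor changes what happens next.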


While researching this: we could actually try to use the more "modern" XRenderComposite approach to do the updates (see https://www.mail-archive.com/xorg@lists.freedesktop.org/msg05968.html). Whether this fixes your problem and/or creates new problems is hard to say, though. In general it seems to be faster…
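
For context, the XRender variant from that mailing-list thread would look roughly like this. The Pixmap-backed backbuffer and the picture formats below are assumptions for the sketch, not a description of how Renoise would actually do it:

```c
/* Rough sketch of an XRenderComposite-based present step.
 * Assumes the backbuffer is a server-side Pixmap, so the driver can keep it
 * in video memory instead of re-uploading all pixels on every update.
 * Build: cc xrender_demo.c -o xrender_demo -lX11 -lXrender */
#include <X11/Xlib.h>
#include <X11/extensions/Xrender.h>

static void present(Display *dpy, Pixmap backbuffer, Window win,
                    int width, int height)
{
    XRenderPictFormat *src_fmt =
        XRenderFindStandardFormat(dpy, PictStandardRGB24);
    XRenderPictFormat *dst_fmt =
        XRenderFindVisualFormat(dpy, DefaultVisual(dpy, DefaultScreen(dpy)));

    Picture src = XRenderCreatePicture(dpy, backbuffer, src_fmt, 0, NULL);
    Picture dst = XRenderCreatePicture(dpy, win, dst_fmt, 0, NULL);

    /* Copy the backbuffer onto the window in one server-side operation. */
    XRenderComposite(dpy, PictOpSrc, src, None, dst,
                     0, 0,   /* src  x, y */
                     0, 0,   /* mask x, y */
                     0, 0,   /* dst  x, y */
                     width, height);

    XRenderFreePicture(dpy, src);
    XRenderFreePicture(dpy, dst);
}
```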


Maybe Compton is just a bad compositor option?

I have personally tested two pretty different machines with Nvidia / Intel video cards and different combinations of both proprietary and free drivers, with:

  1. Manjaro Deepin (Arch base + custom variant of mutter compositor)
  2. Linux Deepin (Debian Unstable base + custom variant of mutter compositor)
  3. Manjaro KDE (Arch base + KWin compositor)
  4. Fedora 29 (Mutter i suppose, both X / Wayland)
  5. Arch Deepin (Arch base + Compiz)

Those are very different systems and compositors which have their pros and cons, quirks and bugs (especially mutter + nvidia), but one thing in my personal tests was constant - Renoise performed outstandingly in all cases, nothing like @lilith described…

I was really surprised / disappointed to see this…

So it must be a DE / Compton issue.
Maybe, if there's any room to test, it's a good idea to try out https://manjaro.org/download/xfce/ which ships more modern versions of XFCE / Compton / kernel / video drivers.

Debian is a little behind, so perhaps there were some bugs in XFCE / Compton that have since been fixed and could cause this?

Hi taktik,

thanks for commenting!! Just to clarify: there's no problem when compton or the internal xfce compositor is running. I get the problem when no compositor is running.

"While researching this: we could actually try to use the more "modern" XRenderComposite approach to do the updates (see https://www.mail-archive.com/xorg@lists.freedesktop.org/msg05968.html). Whether this fixes your problem and/or creates new problems is hard to say, though. In general it seems to be faster…"

Is this something I can try, or are you thinking about changing something in Renoise?

I'm no expert, but I never had issues with Renoise on Ubuntu 18.04 LTS (out of the box) on t440/t450/t460/t470(/s) laptops, if that makes sense.

I also have no problems, with an Intel laptop and an Nvidia workstation, both using open source drivers.

The idea of disabling the compositor when realtime audio is running comes from bad GPU drivers that can sometimes mess with very tightly tuned realtime audio. I found this only relevant when using nvidia or amd closed source drivers, and that status changed over the years with driver updates. Open source drivers are much more "polite" in their operations; if you set up IRQ and realtime priorities properly, there should be no problem running whatever GPU load in parallel with realtime audio.
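
For anyone wondering what "set up realtime priorities properly" typically means in practice: the usual JACK-style limits configuration looks something like this (the file name and values are the commonly recommended ones; adjust the group and limits to your distro):

```
# /etc/security/limits.d/audio.conf - typical realtime audio limits
@audio   -   rtprio     95
@audio   -   memlock    unlimited
```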

I'd still think that the difference might come from the WM not using GPU acceleration at all with compositing disabled. Then the problem would also be independent of Renoise. Maybe there is some benchmark or method to measure the WM's performance?


Now I know a little bit more: I installed the performance tools in Debian.
Then

su
perf top

And the main difference between running Renoise with the compositor and without is a kernel function called drm_clflush_pages(). When the compositor is off, the overhead of this function (the number can be zeroed with the z key to show the difference) is at 99% compared to when the compositor is on, and it stays quite constant at that number. So I assume that this is responsible for the xruns.
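
A possible follow-up (plain perf usage, nothing Renoise-specific, the duration is just an example) would be to record a call graph and see what actually ends up calling drm_clflush_pages():

```
sudo perf record -a -g -- sleep 10   # sample all CPUs with call graphs for ~10 s
sudo perf report                     # expand the drm_clflush_pages entry to see its callers
```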

http://i.imgur.com/jgzYOr2.png

I also found something here, but at the moment I don't have a clue what to do or test:

https://dri.freedesktop.org/wiki/IntelPerformanceTuning/

edit: I’m on #intel-gfx and #dri-devel to get some help … hope someone answers :smiley:

A hypothesis from a guy in #xorg:

“My current hypothesis is that the program constantly redraws this region, probably at more than 60 fps. Without composite, each draw is turned into a GPU operation and performed by the card. When composite is used, the draw operation is done in system RAM, and it is only drawn to the card on vsync.”

He also thought that it would be better to use XShmPutImage() (the MIT-SHM variant) instead of XPutImage().
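
For reference, the MIT-SHM path he is suggesting looks roughly like this: the pixels stay in a shared memory segment that the X server maps directly, so updates don't have to push the whole image over the client/server connection. This is only a hedged sketch (no error handling, setup simplified), not a claim about how Renoise would implement it:

```c
/* Sketch of a shared-memory XImage setup for XShmPutImage.
 * Build: cc shm_demo.c -o shm_demo -lX11 -lXext */
#include <X11/Xlib.h>
#include <X11/extensions/XShm.h>
#include <sys/ipc.h>
#include <sys/shm.h>

static XImage *create_shm_image(Display *dpy, XShmSegmentInfo *shminfo,
                                int width, int height)
{
    int scr = DefaultScreen(dpy);

    XImage *img = XShmCreateImage(dpy, DefaultVisual(dpy, scr),
                                  DefaultDepth(dpy, scr), ZPixmap,
                                  NULL, shminfo, width, height);

    /* Allocate a SysV shared memory segment big enough for the image... */
    shminfo->shmid = shmget(IPC_PRIVATE,
                            (size_t)img->bytes_per_line * img->height,
                            IPC_CREAT | 0600);
    shminfo->shmaddr = img->data = shmat(shminfo->shmid, NULL, 0);
    shminfo->readOnly = False;

    /* ...and let the X server attach to the same segment. */
    XShmAttach(dpy, shminfo);
    return img;
}

/* Per update (ideally rate-limited so it doesn't redraw at more than 60 fps):
 *   XShmPutImage(dpy, win, gc, img, 0, 0, 0, 0, width, height, False);
 */
```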


It seems to be due to the outdated Mesa 13 in Debian stable. It is very, very likely solved in Mesa 18, which can be installed from the backports. This is what I'll try in the next few days…
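
In case it helps anyone reading later, installing Mesa from stretch-backports would look roughly like this (assuming the backports repository is already enabled in sources.list; the exact package set may differ):

```
sudo apt update
sudo apt -t stretch-backports install libgl1-mesa-dri libgl1-mesa-glx libglapi-mesa
```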

So, have you tested it?
Sorry, I'm really keen to know :thinking:

I believe your Mesa assumption should work and explain a lot :)