Homu Interpreting your linked reply, am I to assume both are impossible currently (without Shizuku or other major security-nullifying options) and also extremely unlikely to be supported due to their "UI tweaking" natures?
I don't believe either of those features is built into AOSP. Many Android vendors add special UI features, such as shake-to-flashlight, double-tap-to-lock-screen, and so on; the code for those is typically not open source. Google could choose to reimplement any of those features in AOSP at any point. In theory they are working on a "desktop mode" for AOSP, similar to Samsung's DeX (but in practice it's not clear when the Google version will be usable).
Personally I would classify both of those features as UI tweaks.
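For what it's worth, the shake-to-flashlight half isn't magic at the app level: the public APIs (SensorManager plus CameraManager.setTorchMode) are enough for a rough version. Here's a minimal Kotlin sketch, with a hypothetical ShakeTorchToggle class and an arbitrary 2.5g threshold:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.hardware.camera2.CameraManager
import android.os.SystemClock
import kotlin.math.sqrt

// Hypothetical helper: listens to the accelerometer and toggles the torch
// when the acceleration magnitude crosses a crude "shake" threshold.
class ShakeTorchToggle(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val cameraManager =
        context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    private var torchOn = false
    private var lastToggleMs = 0L

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        // Acceleration in units of g; well above 1g suggests a deliberate shake.
        val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
        val now = SystemClock.elapsedRealtime()
        if (gForce > 2.5f && now - lastToggleMs > 1000) {
            lastToggleMs = now
            toggleTorch()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit

    private fun toggleTorch() {
        // setTorchMode needs no camera permission, but fails while another
        // app is actively holding the camera.
        val cameraId = cameraManager.cameraIdList.firstOrNull() ?: return
        torchOn = !torchOn
        cameraManager.setTorchMode(cameraId, torchOn)
    }
}
```

The catch is that a regular app would need a persistent foreground service to keep receiving sensor events, and it can't hook the behaviour into the lock screen or power button the way a vendor's built-in implementation can, which is roughly why these things end up as OS features (or Shizuku hacks) rather than ordinary apps.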
Homu Are you confident with this response, or do you think it could technically be possible using third-party applications (would a feature of this nature be directly inhibited by GrapheneOS)?
In general, giving an application the ability to intercept screen taps and button presses poses security risks. There are some legitimate reasons to do so, falling under the category of assistive technology, but apps using those "accessibility" hooks need special permissions, which Google has recently made more inconvenient to enable (because malware has tricked people into enabling them). A recent, striking example of the risks posed by malicious apps hijacking user input is "TapTrap".
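To make the risk concrete, here is a minimal Kotlin sketch of an AccessibilityService (the class name is hypothetical and the body is deliberately trivial). Once the user enables it in Settings, its callback receives UI events from every app on the device, which is exactly why the permission is gated and why a malicious app with the same hook is dangerous:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.util.Log
import android.view.accessibility.AccessibilityEvent

// Hypothetical service name; it only logs, but it illustrates how broad the
// visibility is once the user grants the accessibility permission.
class InputWatcherService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {
        // Once enabled, this callback receives UI events (clicks, focus and
        // window changes) from every app on the device.
        if (event?.eventType == AccessibilityEvent.TYPE_VIEW_CLICKED) {
            Log.d("InputWatcher", "Tap observed in ${event.packageName}")
        }
    }

    override fun onInterrupt() {
        // Called when the system wants the service to stop providing feedback.
    }
}
```

The service also has to be declared in the manifest behind the android.permission.BIND_ACCESSIBILITY_SERVICE permission, and it does nothing until the user explicitly turns it on under the Accessibility settings; that manual, somewhat buried opt-in is the "more inconvenient" step mentioned above.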
To be clear, I have not done an exhaustive search of every Android app that might provide the tweaks you're hoping for. But it sounds as if your own research turned up only a Shizuku-based solution, which is at least weak evidence that no "regular app" offers them, and no accessibility-based app either.
Overall: intercepting user input is considered dangerous for apps because it genuinely is dangerous; there are limited exceptions, which are deliberately difficult to enable; and it's fairly unlikely that the GrapheneOS developers will build this sort of thing into the system any time soon.