ACuriousFellow As an example, one could use mass adoption. If an app has millions of downloads it must be okay.
This is a bad assumption. Download counts depend largely on mass appeal, marketing, or peer pressure.
I would even say it's very likely wrong that popular apps are safer; they generally do poorly with regard to privacy (the most popular apps are TikTok, Facebook, ...).
Another could be a trusted voice on a forum like this.
I'm not sure opaque voices on social media are a good way to verify privacy and security.
Or even reading every line of code in the source and every PR if it's open source.
Auditing is a highly regarded process and can be done in a variety of ways; there are companies specializing in audits.
Open source generally makes audits both easier to perform and easier to verify.
The one I'm interested in is someone who does these reviews for a living.
Audits of free and open source software can basically be done by anyone, either largely manually by reading the code, or by using a variety of auditing tools.
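To make the "variety of auditing tools" a bit more concrete: many automated checks boil down to flagging known-risky patterns in source code. Here is a minimal, purely illustrative sketch in Python — the patterns and messages are my own examples, not any real tool's ruleset, and real static analyzers parse code properly instead of using regexes:

```python
import re

# Illustrative ruleset: pattern -> note. Real auditing tools ship far larger,
# curated rulesets and understand the language's syntax.
RISKY_PATTERNS = {
    r"\bstrcpy\s*\(": "unbounded copy, consider strncpy/strlcpy",
    r"\bgets\s*\(": "gets() is unsafe, use fgets()",
}

def scan(source: str) -> list:
    """Return (line number, note) for every line matching a risky pattern."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, note in RISKY_PATTERNS.items():
            if re.search(pattern, line):
                findings.append((lineno, note))
    return findings

c_code = 'int main(void) {\n  char buf[8];\n  gets(buf);\n  return 0;\n}\n'
print(scan(c_code))  # flags the gets() call on line 3
```

Manual review then starts where such pattern matching ends: deciding whether a flagged line is actually exploitable in context.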
The toolkits used for auditing are mostly proprietary and exclusive to the auditing company.
You can hire those auditors yourself; pricing is mostly fairly transparent and depends on the amount of work done.
This can be reasonably affordable if you only want the automated checks, and very expensive if you want manual code review.
I'm interested in tools that a security researcher / enterprise business or similar would use to scan source code and report on that app for its safety.
Those tools tend to be internal and not released to the public, since a public release would be impossible to finance.
Generally, software simply isn't audited, due to cost.
Enterprises almost never really audit software (only the most sensitive industries accept that cost, for example military contractors and governments).
Enterprises mostly depend entirely on off-the-shelf antivirus software and some sandbox testing, rounded off with strict firewall rules and the security tools offered by the operating system vendor (in most cases Microsoft, Google, or Apple).
The kind of report that lists, say, known CVEs or known exploits, dodgy coding, tracking libraries, etc.
As mentioned, there's not a lot of such software around. There are some basic open source checking tools, like the dependency checkers for some programming language ecosystems (Rust, for example).
There's also Metasploit, one of the better-known open source tools, developed by Rapid7, one of those companies you can hire for audits.
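For a rough idea of what such a dependency checker does under the hood, here is a toy sketch in Python. The advisory database, package names, and advisory ID are entirely made up for illustration; real tools pull from curated vulnerability databases instead:

```python
# Known-bad version ranges, keyed by package name. Entirely fictional entries,
# standing in for a real advisory database.
ADVISORIES = {
    "examplelib": [("<=1.2.3", "EXAMPLE-2024-0001: buffer overflow in parser")],
}

def parse_version(v: str) -> tuple:
    return tuple(int(part) for part in v.split("."))

def is_affected(version: str, constraint: str) -> bool:
    # Only handles the "<=X.Y.Z" form used in the toy data above.
    assert constraint.startswith("<=")
    return parse_version(version) <= parse_version(constraint[2:])

def check(dependencies: dict) -> list:
    """Return an advisory message for every vulnerable pinned dependency."""
    findings = []
    for name, version in dependencies.items():
        for constraint, advisory in ADVISORIES.get(name, []):
            if is_affected(version, constraint):
                findings.append(f"{name} {version}: {advisory}")
    return findings

print(check({"examplelib": "1.2.0", "otherlib": "4.0.1"}))
```

The point is that this class of check is cheap and fully automatable, which is why it's one of the few kinds of auditing that actually happens at scale.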
Commercial software is rarely audited; some vendors do internal audits and require their customers to trust the auditing firm and the scope of the audit.
Open source software is almost never audited; there are only a few exceptions.
Sometimes governments pay for audits for open source software, for example the German government does so regularly (Matrix and Element have been audited through the German information security agency).
The big commercial distributions also sometimes audit some of their core products (Red Hat and Co.).
Other than that, there's Google's Project Zero which does internal checking for a large number of open source projects, but little is known of their toolkit and methodology.
GrapheneOS itself has never been audited; its security rests largely on trust in Google, in the GrapheneOS developers, and in its open source nature (which the GrapheneOS developers themselves don't consider a valuable indicator of security, as far as I can tell).
As for privacy, there's even less checking. Privacy is generally not considered by businesses or software distributors.
There are some exceptions:
F-Droid does some basic checks and will actively remove code they consider problematic for privacy.
They are, however, also very limited in resources and have often been criticized for insufficient review and slow updates (whether that's true or not, I cannot tell).
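As an illustration of what such basic checks can look like, here is a toy sketch of a tracker scan that matches an app's class names against known tracker namespaces (scanners like Exodus Privacy work on a similar principle over the compiled app's class list). The namespace list below is a tiny illustrative sample, not F-Droid's actual ruleset:

```python
# Sample tracker namespaces; real scanners maintain databases of hundreds.
TRACKER_NAMESPACES = [
    "com.facebook.ads",
    "com.google.firebase.analytics",
    "com.crashlytics",
]

def find_trackers(class_names: list) -> set:
    """Return the tracker namespaces that any of the app's classes fall under."""
    found = set()
    for cls in class_names:
        for ns in TRACKER_NAMESPACES:
            if cls == ns or cls.startswith(ns + "."):
                found.add(ns)
    return found

app_classes = [
    "org.example.app.MainActivity",
    "com.google.firebase.analytics.FirebaseAnalytics",
]
print(sorted(find_trackers(app_classes)))
```

A check like this can only find trackers it already knows about, which is part of why such review is necessarily limited.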
There's also Accrescent, which is backed, or at least recommended, by the GrapheneOS developers. Accrescent also does some automatic checks and claims to do manual review if those checks fail or if privacy-sensitive permissions are used.
How exactly they do that, and how it is financed, is unclear; there is no public information on the auditing process or its results. Users therefore have to trust Accrescent as well as the app developers, since a review can only fail: Accrescent always ships exactly the code the app developer provides and does not patch software the way F-Droid or Linux distributions sometimes do.