One of my areas of work as an SEO is technical SEO and technical audits. I mostly do this for large websites, using a professional toolkit. Part of my audit checklist:
* Indexability – check whether all pages are indexed and available for indexing; if they are not, figure out why
* Speed – find any loading-speed issues; speed is an important ranking factor and also affects conversions
* Interlinking – find poorly interlinked or orphaned pages, crawl-depth issues, and follow/nofollow issues
* Duplicates – check for duplicate titles, descriptions, h1s, and the content itself; uniqueness is an important ranking factor
* Content – find thin or low-quality content and mixed content (HTTP resources on an HTTPS website)
* SEO-effective pages – identify the most and least SEO-efficient pages, and check for crawl-budget waste and orphaned pages
* Log file analysis – see which bots visit your website, why, and how often; see which pages Googlebot favors and which it ignores, and find out why
* Crawlability – check whether pages are available to bots; if they are not, find out why: which status codes they return, whether there are harmful redirect chains, etc.
* Bot behavior – check the correlations between bot visits and crawl depth, interlinking level, content size, indexability, duplicates, etc.
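To give a flavor of the duplicates check: a minimal sketch, assuming crawl data is already exported as a {url: title} mapping (the URLs and titles here are made up; the same grouping works for descriptions, h1s, or content hashes).

```python
from collections import defaultdict

def find_duplicates(pages):
    """Group URLs by normalized title and keep only groups with 2+ URLs."""
    groups = defaultdict(list)
    for url, title in pages.items():
        groups[title.strip().lower()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl output: a URL parameter producing a duplicate title.
pages = {
    "/shoes/red": "Red Shoes | Shop",
    "/shoes/red?sort=price": "Red Shoes | Shop",
    "/shoes/blue": "Blue Shoes | Shop",
}
print(find_duplicates(pages))
```

In a real audit a crawler collects the titles; this only shows the grouping step.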
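The log file analysis step can be sketched like this, assuming access logs in the common Apache "combined" format (the sample lines and bot token are illustrative):

```python
import re
from collections import Counter

# One line of Apache combined log format: ip, timestamp, request, status, agent.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def bot_hits(lines, bot_token="Googlebot"):
    """Count which paths a given bot requested, and the status codes it got."""
    paths, statuses = Counter(), Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and bot_token in m.group("agent"):
            paths[m.group("path")] += 1
            statuses[m.group("status")] += 1
    return paths, statuses

# Hypothetical log lines: two Googlebot hits (one of them a 404) and one human.
sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /blog/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/Oct/2023:13:56:02 +0000] "GET /old-page HTTP/1.1" 404 209 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.5 - - [10/Oct/2023:13:57:11 +0000] "GET /blog/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
paths, statuses = bot_hits(sample)
print(paths.most_common())   # which pages the bot requests most often
print(statuses)              # which status codes it receives
```

On a real site the user agent alone is not proof of Googlebot (it can be spoofed); a full audit verifies the IPs as well.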
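The redirect-chain part of the crawlability check can be sketched without live HTTP requests by walking a pre-crawled {url: redirect_target} map (the map below is hypothetical data) and flagging multi-hop chains and loops:

```python
def redirect_chain(url, redirects, max_hops=10):
    """Follow redirects from `url`; return the chain and a verdict."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        if url in chain:          # revisiting a URL means a redirect loop
            chain.append(url)
            return chain, "loop"
        chain.append(url)
    # A single hop is fine; two or more hops is a chain worth fixing.
    return chain, "chain" if len(chain) > 2 else "ok"

# Hypothetical crawl output: /old takes two hops; /a and /b redirect to each other.
redirects = {
    "/old": "/older",
    "/older": "/new",
    "/a": "/b",
    "/b": "/a",
}
print(redirect_chain("/old", redirects))
print(redirect_chain("/a", redirects))
```

A real check would also record the status code of each hop (301 vs 302 matters for how link equity is passed).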
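And the orphaned-pages part of the interlinking check is, at its core, a set difference between the sitemap and the internally linked URLs (both sets here are invented examples of crawler output):

```python
# Hypothetical data: URLs declared in the sitemap vs. URLs reachable
# through internal links found by a crawl.
sitemap_urls = {"/", "/blog/", "/blog/post-1", "/landing-2019"}
linked_urls = {"/", "/blog/", "/blog/post-1"}

# Orphans: pages bots can only discover via the sitemap, not via links.
orphans = sitemap_urls - linked_urls
print(orphans)
```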
If you’re interested, I can run a quick test check and give you a short review for free.