Managing the volume and quality of links in to links out for each page is a critical metric of internal linking and on-page link equity. This variable alone can consolidate or diffuse every other layer of optimization, and it directly impacts indexation and rankings.
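To make that metric concrete, here is a minimal sketch (the URLs and link graph are hypothetical; in practice you would populate the graph by crawling your own site) that counts internal links in and out of each page and reports the ratio:

```python
# Minimal sketch: audit the links-in / links-out ratio for each page.
# The link graph below is hand-built for illustration; a real audit
# would populate it from a crawl of the site.
from collections import defaultdict

# page -> internal pages it links out to (hypothetical URLs)
links_out = {
    "/": ["/services/", "/blog/", "/contact/"],
    "/services/": ["/services/seo.html", "/services/ppc.html"],
    "/services/seo.html": ["/services/"],
    "/services/ppc.html": ["/services/", "/contact/"],
    "/blog/": ["/services/seo.html"],
    "/contact/": [],
}

links_in = defaultdict(int)
for page, targets in links_out.items():
    for target in targets:
        links_in[target] += 1

for page, targets in links_out.items():
    out = len(targets)
    ratio = links_in[page] / out if out else float("inf")
    print(f"{page:22s} in={links_in[page]} out={out} ratio={ratio:.2f}")
```

A page whose ratio falls well below 1.0 is leaking more equity than it receives, which is exactly the imbalance this metric is meant to expose.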
Sculpting link equity within a website is what distinguishes that site from others targeting the same terms. For example, suppose two websites are fighting for a dominant position in search engines with nearly identical off-page profiles.
Let’s assume Site A has 100 pages and relies on the homepage as its primary page: the homepage houses 90% of the inbound links to the site, and the site then randomly distributes (or rather squanders) that link equity through the homepage and primary navigation alone.
On the other hand, let’s assume Site B has 100 pages: 10 pages in the root folder and 9 subfolders that use relevant naming conventions and SEO-friendly URLs (website.com/descriptive-folder/main-keyword.html). Site B also uses contextual internal links, linking from relevant page to relevant page whenever a keyword appears on a page, thereby sculpting an internal link preference for each keyword.
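One simple way to implement that kind of contextual linking (a sketch; the keyword-to-URL map is hypothetical) is to designate one preferred landing page per keyword and link the first occurrence of each keyword in a page's body copy:

```python
import re

# Hypothetical map: each keyword gets one preferred internal landing
# page, so every contextual link reinforces the same URL for that term.
keyword_targets = {
    "link equity": "/descriptive-folder/link-equity.html",
    "site architecture": "/descriptive-folder/site-architecture.html",
}

def add_contextual_links(body: str) -> str:
    """Link the first occurrence of each mapped keyword in the copy."""
    for keyword, url in keyword_targets.items():
        pattern = re.compile(re.escape(keyword), re.IGNORECASE)
        body = pattern.sub(
            lambda m: f'<a href="{url}">{m.group(0)}</a>',
            body,
            count=1,  # one contextual link per keyword per page
        )
    return body

print(add_contextual_links("Site architecture shapes how link equity flows."))
```

Linking only the first occurrence keeps the page readable while still conveying one consistent internal preference per keyword.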
Also, Site B does not have 90% of its off-page ranking factor pointing at the homepage (which is often the least likely page to convert). On the contrary, links are distributed throughout the site, either to the individual pages with the highest contextual relevance to the keywords in question or to the subfolder housing the content (10 pages of semantically related content), which then redistributes that link equity to its pages as needed.
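To see the contrast in numbers, here is a toy PageRank-style simulation (five-page hypothetical graphs, not real crawl data) in which Site A concentrates external equity on a homepage that only links out through its navigation, while Site B spreads external equity across topical pages that also interlink contextually:

```python
# Toy PageRank-style comparison of concentrated vs. distributed equity.
# The graphs and boost values are hypothetical, for illustration only.

def pagerank(graph, external_boost, damping=0.85, iterations=50):
    """Power iteration where external_boost stands in for inbound links."""
    total = sum(external_boost.values())
    base = {page: external_boost[page] / total for page in graph}
    rank = dict(base)
    for _ in range(iterations):
        new_rank = {page: (1 - damping) * base[page] for page in graph}
        for page, targets in graph.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Site A: the homepage hoards 90% of inbound equity; deep pages only
# receive what trickles down through the navigation.
site_a = {"home": ["p1", "p2", "p3", "p4"], "p1": ["home"],
          "p2": ["home"], "p3": ["home"], "p4": ["home"]}
boost_a = {"home": 90, "p1": 2.5, "p2": 2.5, "p3": 2.5, "p4": 2.5}

# Site B: inbound equity is spread across pages that also interlink.
site_b = {"home": ["p1", "p2", "p3", "p4"], "p1": ["p2", "home"],
          "p2": ["p3", "home"], "p3": ["p4", "home"], "p4": ["p1", "home"]}
boost_b = {"home": 20, "p1": 20, "p2": 20, "p3": 20, "p4": 20}

for name, graph, boost in (("Site A", site_a, boost_a),
                           ("Site B", site_b, boost_b)):
    ranks = pagerank(graph, boost)
    print(name, {page: round(score, 3) for page, score in ranks.items()})
```

In this toy model Site B's deep pages, the ones actually targeting keywords, end up with markedly higher scores than Site A's, even though both sites receive the same total external equity.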
Which website do you think will rank higher for a given array of keywords? My vote is for Site B…
That is not to say either site is incapable of conquering a competitive vertical or an array of keywords. The main distinction is in how each website’s referential integrity conveys a distinguishable preference for specific keywords, versus taking a wild-west approach: crossing your fingers, swapping out meta tags, and expecting dramatic shifts in rankings.
Ironically, site architecture and internal linking are often the last things considered when scaling a website, much like finishing a building and then remembering to add the plumbing or electrical. It will cost you more to improvise and create workarounds than to start fresh (or at least create a new site segment) and consolidate a tiered, structured approach to rankings.
Rankings are a byproduct of (1) a programming platform conducive to adding relevant content, (2) a pliable content management system or array of templates that can accommodate the market focus of each page, and (3) the extent of internal linking and cross-linking of pages and segments, plus off-page inbound links from other sites (deep links), which together validate the degree of peer review and internal relevance and convey a semantic structure to search engines.
Stray from this model and you may have a strong site that ranks for everything you don’t want, without the ability to capture the more competitive keywords and key phrases that hold the highest potential return on investment and positioning for a given market or market segment.
You have 100% control over your own website, and there is no penalty for creating content. The extent of indexation, however, depends on how unique the content is and on the degree of consumption from search engines, social media, and other online sources.
The main takeaway here is that once your website reaches a critical threshold (based on keyword co-occurrence), it can provide vital ranking factor to its other pages, lending them buoyancy with only a fraction of the external links they would otherwise need from outside sources.
If you understand the depth of this implication, then with content your website can cross that tipping point and (a) serve as a primary destination from search engines (after they crawl and index your content), while (b) those established pages provide primary ranking factor for new pages in the site.
This is how Wikipedia dominates the web: (1) a powerful CMS, (2) relevant content reinforced by strong internal linking, and (3) an aggregate collection of deep links (links from other websites) to specific pages with themed, related keywords that correspond to the on-page content.
Sure, you could build a website the old-fashioned way and rely on links to the homepage to carry the site and its rankings. Or you could diversify the link equity in the site, create a topical hierarchy, and consume the web one keyword cluster at a time.
So, the premise of links in to links out is more than just a catchy saying: if you have strong internal links feeding a page, you should limit the number of times you link out from that page to unrelated content, and you should not allow the navigation to siphon potential link flow away from vital pages.
The more links that leave a page, the less ranking factor each of those links passes along (unless the page is augmented by other internal or external links). So, the moral of the story is: make each link count.
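Under the classic PageRank model (a back-of-the-envelope simplification, not the exact formula any engine uses today), the equity a link passes is roughly the page's score divided by its total outbound links, so every unrelated link you trim raises the share each remaining link carries:

```python
# Back-of-the-envelope: equity passed per link = page score / links out.
# The page score of 1.0 is an arbitrary illustrative value.
page_score = 1.0

for outbound_links in (5, 10, 20):
    per_link = page_score / outbound_links
    print(f"{outbound_links:2d} links out -> each link passes {per_link:.3f}")

# Output:
#  5 links out -> each link passes 0.200
# 10 links out -> each link passes 0.100
# 20 links out -> each link passes 0.050
```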