My problem got solved: you can use the non-indexed links in new posts so the Google bot can crawl those pages too. It also helps to add internal links like "read this article" or "read this sports post".
Suggesting links this way can help your website get a good ranking.
That's why Google says a sitemap is important: these types of issues come from the interlinking of the website. For example, if your home page links to five pages, those five pages will be indexed first, and only after a few days or months will search engines index the others. If the site has a sitemap, there is a chance all the other pages get indexed at the same time when the bots visit your site.
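To illustrate, a sitemap is just an XML file listing every URL you want crawled, including deep pages the home page never links to. A minimal sketch (the domain and paths here are placeholders, not from the original post):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- example.com is a hypothetical domain used for illustration -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <!-- a deep page not linked from the home page still gets discovered -->
  <url>
    <loc>https://example.com/deep-page/</loc>
  </url>
</urlset>
```

You would typically upload this to the site root and submit it in Google Search Console so the bot can fetch all listed URLs in one pass.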
If some particular pages are not being listed by Google, then it's most likely an access issue. Websites usually treat crawlers as ordinary guests.
So, if you allow only members to access the content, then you will have to change the user group that crawlers belong to.
Only then can crawlers access the content. Also, if access is somehow being restricted by the robots.txt file, you will have to change it, since the robots.txt file determines which parts of a website crawlers may access.
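As a concrete sketch of that last point, a robots.txt file at the site root controls crawler access per path. The paths and domain below are hypothetical examples, not taken from the poster's site:

```
# robots.txt at the site root; rules apply to all crawlers
User-agent: *
Disallow: /members/    # members-only area stays off-limits to bots
Allow: /blog/          # public content remains crawlable

# pointing crawlers at the sitemap is also done here
Sitemap: https://example.com/sitemap.xml
```

If a page you want indexed falls under a Disallow rule, removing or narrowing that rule is usually the fix.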
I haven't had any issues with crawling and indexing my website. When I started my blog and wanted it indexed, all I needed to do was go to Google Search Console and enter my domain name for verification. After verifying my domain, I was instructed to add my permalink, and that was it.
This is the first time I am hearing about such a problem, to be honest. Have you perhaps tried contacting them directly to ask about this issue? It might be something that can be resolved on their side without you having to do anything, or it could be an error in your website's development.