The same applies to internal copies: if you duplicate content on your own site, whether intentionally or by mistake, Google will recognize only one version, the one it selects as the original. The rest will not be indexed, and this is normal. It is therefore worth avoiding content duplication by carefully planning your SEO strategy.

Here we have two aspects. The first is what happens on the server side. If the hosting is slow and the page code is not optimized for speed, the crawlers will prefer to give up and come back at another time. Note that if this happens repeatedly, Google may lower your website's priority and visit it much less often.
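One rough way to spot internal duplicates before Google does is to fingerprint each page's visible text and compare the hashes. The sketch below is a minimal illustration, not any official tool: `content_fingerprint` and its tag-stripping regex are assumptions for the example, and a production audit would crawl real URLs and use fuzzier matching.

```python
import hashlib
import re

def content_fingerprint(html_body: str) -> str:
    """Strip tags, normalize whitespace and case, then hash the text.

    Pages with identical fingerprints are exact duplicates and risk
    being collapsed into a single indexed URL.
    """
    text = re.sub(r"<[^>]+>", " ", html_body)         # drop markup
    text = re.sub(r"\s+", " ", text).strip().lower()  # normalize whitespace/case
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Two pages that differ only in markup and casing collapse to one fingerprint,
# flagging them as duplicate content.
page_a = "<p>Summer  Sale</p>"
page_b = "<div>summer sale</div>"
assert content_fingerprint(page_a) == content_fingerprint(page_b)
```

Running this over an exported sitemap and grouping URLs by fingerprint quickly shows which pages compete with each other for the "original" slot.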
Another issue is page rendering. Some crawlers literally behave like a user: they don't just peek at the code, they render (display) the page the way a user would, using Chrome for mobile devices. If the page loads slowly in the browser, that is another signal to slow down crawling and indexing a bit. To remedy this, focus on the Core Web Vitals metrics.

JS rendering issues are a separate problem. Some pages are built with modern JavaScript frameworks, and Google can't interact with the page the way a user can: it can't click or scroll. JS issues are always difficult to diagnose, but eliminating them is necessary if you want your site to be visible on Google.
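A quick heuristic for spotting JS-dependent pages is to check how much visible text the server actually returns before any JavaScript runs: an almost-empty HTML shell suggests the content only appears after rendering. This is a simplified sketch under that assumption; `looks_js_dependent`, the `threshold` value, and the sample markup are all hypothetical, and a real diagnosis would compare raw HTML against a headless-browser render.

```python
from html.parser import HTMLParser

class VisibleTextCounter(HTMLParser):
    """Counts characters of visible text, ignoring <script>/<style> contents."""

    def __init__(self):
        super().__init__()
        self.chars = 0
        self._skip_depth = 0  # >0 while inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.chars += len(data.strip())

def looks_js_dependent(raw_html: str, threshold: int = 200) -> bool:
    """True when the server-rendered HTML carries almost no visible text,
    a rough sign that the content only appears after JavaScript runs."""
    parser = VisibleTextCounter()
    parser.feed(raw_html)
    return parser.chars < threshold

# A typical single-page-app shell: an empty mount point plus a script tag.
spa_shell = "<html><body><div id='app'></div><script>renderApp()</script></body></html>"
assert looks_js_dependent(spa_shell)
```

Pages flagged this way are the ones worth testing with Google's URL Inspection tool, since what the crawler sees before rendering may be nearly empty.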