Greetings Wise Monks
Over the years I have created and maintained a crawler for a customer. Lately, on a few of the sites they crawl, the parser modules (pullparser/tokeparser) have been getting noisy with this warning: "Parsing of undecoded UTF-8 will give garbage when decoding entities".
I have done all the searches and tried every solution I found, from simply calling utf8_mode(1) to encoding/decoding the page before passing it to the parser, all to no avail.
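For comparison, this is the minimal decode-before-parse shape that should silence that warning. This is only a sketch: `$page` stands in for the raw bytes your fetch returns, and the sample string here is hypothetical.

```perl
use strict;
use warnings;
use Encode qw(decode);

# $page stands in for the raw octets fetched off the wire; this
# hypothetical sample is the UTF-8 byte sequence for "Café".
my $page = "<title>Caf\xC3\xA9</title>";

# decode() with a CHECK flag may consume its source, so work on a copy.
my $bytes = $page;

# Decode the octets into Perl's internal character form BEFORE the
# parser sees them; feeding undecoded multibyte UTF-8 to the parser
# is exactly what triggers the "Parsing of undecoded UTF-8" warning.
# FB_CROAK makes decode() die on malformed input instead of
# silently substituting replacement characters.
my $chars = decode('UTF-8', $bytes, Encode::FB_CROAK);

# Then hand the *characters* (not the octets) to the parser, e.g.:
#   my $p = HTML::TokeParser->new(\$chars);
print length($page), " octets -> ", length($chars), " characters\n";
```

If you are already doing exactly this and still see the warning, the next suspect is the handful of pages themselves rather than the parsing code.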
I have saved the pages to disk to examine them, and they look fine. We use curl::multi to fetch the pages, and all is well for all but a couple of sites. Those sites are quite large and sit behind a CDN, but that is the norm for a number of their clients, and we don't get the same warnings elsewhere.
I'm not even sure how to ask how to fix what I don't know is broken. I suppose the right question is: what can someone suggest to help me track down WHY it's noisy, when the pages appear to be properly UTF-8 encoded? The utf8 flag is on, i.e. $page is the raw HTML document, and in all my testing it appears to be properly encoded, not a mix of encodings, etc.
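One sanity check worth running on the noisy pages: confirm whether the scalar you hand the parser is raw octets or already-decoded characters, and whether a strict decode even succeeds. A sketch using only core Encode; the two sample strings are hypothetical, with $bad imitating the kind of stray Latin-1 byte a mislabelled CDN response can smuggle in:

```perl
use strict;
use warnings;
use Encode qw(decode);

my $good = "Na\xC3\xAFve";   # valid UTF-8 octets for "Naïve"
my $bad  = "Na\xEFve";       # bare 0xEF byte: NOT valid UTF-8

my @valid;
for my $page ($good, $bad) {
    # utf8::is_utf8 reports whether Perl regards the scalar as
    # decoded characters (flag on) or raw octets (flag off).
    my $flag = utf8::is_utf8($page) ? 'on' : 'off';

    # A strict decode croaks on anything that isn't valid UTF-8,
    # pointing at the exact page that upsets the parser.
    # Decode a copy: decode() with a CHECK flag may consume its source.
    my $copy = $page;
    my $ok = defined eval { decode('UTF-8', $copy, Encode::FB_CROAK) };
    push @valid, $ok ? 'yes' : 'no';

    printf "flag=%s valid_utf8=%s\n", $flag, $ok ? 'yes' : 'no';
}
```

If the flag is already on when the string still contains UTF-8 byte sequences, the page has been decoded twice somewhere upstream; if the strict decode croaks only on the noisy pages, you've found the byte the parser is complaining about.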
Is it something simple that I have overlooked or misunderstood? And why doesn't it happen on every page? The pages are all generated by a CMS, but the parser barks only on some of them.
Thank you in advance for any wisdom that is rendered. I've run out of ideas.
I have looked here, searched on Google, and tried every possible solution I could understand (there must be one that I have missed or misunderstood).